Electronic Computers

Since olden times, humans have invented practical mnemonic rules to speed up calculations on numbers and to ensure that the results are correct. These rules varied in complexity depending on the system used to represent numbers: very complex with Roman numerals and rather straightforward with Arabic numerals, but all of them amounted in practice to the blind application of fixed rules. It was therefore only natural that thinkers of different times should entertain the idea of building machines capable of performing calculations automatically by codifying those rules into the operation of a machine.

In modern times some of those individuals were Napier (16th–17th century), Pascal and Leibniz (17th century) and Babbage (19th century). Concrete and lasting developments, however, only happened in the 20th century, when several mechanical and electromechanical machines were built, among them the so-called Harvard Mark I. ENIAC, built entirely with vacuum tubes immediately after World War II, can probably be taken as marking the beginning of the age of electronic computers.

Computers were born as machines capable of processing binary data (though I used one that employed ternary data during an internship at KDD Labs. in the mid 1960s) according to an internal processing architecture. The basic common elements of this architecture are the Random Access Memory (RAM) and the Central Processing Unit (CPU). The CPU has permanently wired instructions, the basic commands that the machine executes, such as comparing two memory locations and performing different operations depending on whether the comparison yields >, = or <, or calculating the square of a number and returning the result in a register. Programs, i.e. the sequences of instructions written by the programmer to achieve some predetermined goal, are stored in the RAM, and the program instructions themselves can be altered during the execution of the program.
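To make the idea concrete, here is a minimal sketch in C of a machine in this style: a single memory array holds both instructions and data, and a fetch-decode-execute loop interprets them. The five-instruction set is invented purely for illustration and does not correspond to any historical machine.

#include <stdio.h>

/* A toy stored-program machine in the von Neumann style: one memory
 * array holds both instructions and data. The opcodes are hypothetical. */
enum { LOAD, SUB, JGT, PRINT, HALT };

int main(void) {
    /* Each instruction occupies two cells: opcode, operand.
     * This program counts down from the value stored at address 14. */
    int mem[16] = {
        LOAD, 14,   /* acc = mem[14]              (address 0) */
        PRINT, 0,   /* print acc                  (address 2) */
        SUB, 15,    /* acc -= mem[15]             (address 4) */
        JGT, 2,     /* if acc > 0 jump to addr 2  (address 6) */
        HALT, 0,    /* stop                       (address 8) */
        0, 0, 0, 0, /* unused                     (10..13)    */
        3,          /* data: initial counter      (address 14) */
        1           /* data: decrement            (address 15) */
    };

    int pc = 0, acc = 0, running = 1;
    while (running) {
        int op = mem[pc], arg = mem[pc + 1];
        pc += 2;                                  /* default: next instruction */
        switch (op) {
        case LOAD:  acc = mem[arg];        break;
        case SUB:   acc -= mem[arg];       break;
        case JGT:   if (acc > 0) pc = arg; break; /* conditional branch */
        case PRINT: printf("%d\n", acc);   break;
        case HALT:  running = 0;           break;
        }
    }
    return 0;
}

Running it prints 3, 2, 1 and stops. Because instructions and data live in the same memory, such a program could even overwrite its own instructions while running, which is exactly the property mentioned above.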

Beyond these common general architectural elements, usually referred to as the “von Neumann architecture” after the mathematician who first gave a systematic treatment of the topic, computer designers made their own decisions concerning the specific elements: the instruction set, the organisation of bits in bytes and words, the number of bits in a byte, the number of bytes used to represent integers and real numbers, the number of bytes used to transfer data from one part of the computer to another, etc.
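Most of these choices have since converged (8-bit bytes, two's-complement integers, IEEE 754 floating point), but the C language still treats them as properties of the particular machine. A small probe, assuming nothing beyond the standard library, makes them visible:

#include <stdio.h>
#include <limits.h>

/* Print a few of the representation choices mentioned above, which
 * C exposes as implementation-defined properties of the machine. */
int main(void) {
    printf("bits per byte    : %d\n",  CHAR_BIT);
    printf("bytes per int    : %zu\n", sizeof(int));
    printf("bytes per long   : %zu\n", sizeof(long));
    printf("bytes per double : %zu\n", sizeof(double));

    /* Byte order: inspect the first byte of a known multi-byte value. */
    unsigned int probe = 1;
    printf("byte order       : %s-endian\n",
           *(unsigned char *)&probe ? "little" : "big");
    return 0;
}

On a typical x86-64 machine this reports 8-bit bytes, 4-byte integers and little-endian byte order, but each of those answers was once a genuine design decision that varied from one manufacturer to another.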

Mass storage devices, such as disks and tapes – even paper tapes – were used as input/output (I/O) devices with the advantage of having a large capacity and of retaining data permanently. Therefore they could also be used to “load” programs or portions of programs at “run time”. Computers also had a variety of other input devices, such as keyboards that operators could use to input numbers or characters, and output devices, such as printers, that could be used to communicate the results of the calculations.

For the first 20 years electronic computers were very large machines, each costing millions of dollars. They were housed in large air-conditioned rooms, usually equipped with a large number of peripherals and attended by a host of specialists. These “vestals” reborn for the computer age were the only ones allowed into the “penetralia” of the computer room, while computer users, looking more like day labourers in a queue than scientists, were forced to wait in line with their card decks and hand them over to the specialists who, having scheduled the “jobs” at their own inscrutable discretion, would in due course give back a printout with the results of the program’s execution.

The evolution of the computer from “big iron”, when one needed a surface comparable to a warehouse to run the machine, to today’s almost infinite computer manifestations – large and small – has been truly exciting. The use of transistors first and of integrated circuits later progressively reduced the size while increasing the number-crunching and random-access storage capabilities. This success story can be explained by the fact that computers were the tool that enabled the solution of problems that would otherwise not have been solved, or solved only at much higher cost: scientific calculations, payrolls, client orders, inventory management and more.

As the breadth of new application domains became clear, an ever-increasing number of types of computers appeared, made by an increasing number of companies that saw computing technology as a “we must be there” business opportunity. The large number of computer companies in the first 20 years of electronic computing, and the very small number today, would seem to support the idea that what was at work was an almost perfect implementation of the Darwinian survival of the fittest.

Every company that undertook to enter the electronic computer business started from scratch, developing its own implementation of a von Neumann machine, adding peripherals, writing system and application software, and selling computing machines with new features and models at an accelerated pace. Some of those machines are still remembered by those who used them and admired for the excellence of the technical solutions adopted in those early times.

But was this the triumph of a Darwinian process applied to competing companies? I dare say no. Towards the mid 1980s IBM was the largest computer company, a position it had easily created for itself since the early days of the electronic computer age. The next entry in the list was Digital Equipment Corp. (DEC), a company one order of magnitude smaller than IBM. The conclusion is easy to draw: these fascinating results were achieved not because there was competition, but because of the size of IBM and the dominant position it had occupied in the market of “Tabulating Machines” before the advent of the electronic computer age.

This dominant position allowed IBM to mobilise its undoubtedly excellent research, design and manufacturing capabilities – easier to do thanks to the rich revenues that the company enjoyed – to respond to the needs of its existing clientele and conquer an ever larger share of the nascent computer market. If IBM did not eventually become the only game in town, it was because of the investigation of the USA Department of Justice – hardly a component of a Darwinian process – which paralysed the company for years and changed its attitude forever. The lack of competition caused by IBM’s dominant position forced those who could not compete with computers that were “better” for the existing categories of customers to make computers that were “different”, i.e. “smaller” (compared to the mainframes of that time), to serve new categories of customers.

In the late 1960s (and for quite some time afterwards) those buying computers were the big corporate data processing departments, which derived considerable power from the fact that the data processing needs of every department had to be fulfilled by their very expensive corporate machines. IBM and the other mainframe vendors provided computing solutions matching this organisational structure.

The Digital Equipment and Data General start-ups attacked other, less structured and more individualistic entities, such as university departments and research laboratories, and started installing small (by the standards of those times) computers – hence the name “minicomputer” – that did not even require air conditioning and cost a fraction of the price of a mainframe. IBM had its hands tied against this competition: had it engaged in providing competitive minicomputer solutions, it would have lost the support of its rich corporate clients and undermined the market for its own primary products.

So from the mid 1960s the “minicomputer” started making inroads into company departments and began a process that gradually deprived the managers of the big centralised computers of their power. “Now you can own the PDP-5 computer for what a core memory alone used to cost: $27,000”, ran one 1964 advertisement by Digital Equipment (later acquired by Compaq Computer, itself later acquired by Hewlett Packard, which will be shedding its PC business soon). The Programmed Data Processor (PDP) series of minicomputers (notably the PDP-8 and PDP-11) was very successful and brought cheap computing within the reach of small research groups, including mine at CSELT.

In the early 1970s the time was ripe for yet another round of computer downsizing. The progress of circuit integration produced the microprocessor, a complete CPU on a single silicon chip, around which a “microcomputer” could be built by adding RAM and some standard interfaces. The Altair, made by Micro Instrumentation and Telemetry Systems (MITS), was the first commercially successful personal computer (PC). It used the Intel 8080 8-bit microprocessor and a bare (to today’s eyes) 256 bytes of RAM. It was designed as a kit for hobbyists and professionals to build and use at home or at work.

This created the conditions for the mass introduction of the first PCs, home computers and game machines: Commodore, Atari, Apple and a host of other PC makers were established. Some of these shone for a while until they fell into oblivion. Apple Computer’s Apple II and Commodore’s Amiga were very successful: the former was instrumental in creating the PC image that we know today, while the latter remains the emblem of the home computer, making some wonder what the world would be today if the Amiga, and not the PC, had taken over in the home.

IBM identified the PC as an ideal opportunity for a new business that would eventually leapfrog the minicomputer industry. IBM designed and developed its PC giving the group developing it the authority to use whatever components they thought suitable, if necessary defying the iron internal procurement rules of the company. So in 1981 IBM unveiled the IBM PC, built around the Intel 8088, a low-cost variant of the 16-bit Intel 8086 with an 8-bit external bus, later followed by the more powerful members of the extended family that became known as Intel x86. The other innovative idea that the team adopted was to use openly available components for the hardware design, so that anybody could build PC “clones”.

In 1984 Apple released the Macintosh, which used the powerful Motorola 68000 16/32-bit microprocessor. So, throughout the late 1980s and the early 1990s, an increasing number of converts got an IBM PC or a Macintosh and left the serfdom of terminals connected to the corporate or departmental mainframe. I was probably the first in my company to use an IBM PC XT in the office, but I used it in “dual mode” because it was also connected to the company mainframe in terminal mode. The side effect was that the PC started making inroads into the small office and even the home, where the Amiga had disappeared after Commodore went out of business.

Today the number of CPU types in sizeable use in the PC environment is rather small. Towering among these is the Intel CPU, of which several generations have already seen the light and which has displaced the PowerPC from the Mac. This is the time to check the validity of yet another sacred cow of technology, i.e. that competition makes technology progress. This is simply not true, not because competition is harmful, but because it is “expensive” and in most cases irrelevant to reaching the goal. The CPU in largest use today in the PC world – the one that has created the largest silicon company in the world – is the offspring of a CPU designed 35 years ago. Other, possibly “better”, CPUs did appear and for some time seemed to have the upper hand, but none could withstand the x86 because of the size of the almost captive market for the IBM PC CPU. Now, with Apple’s move to adopt the Intel CPU for the Mac, its dominance in the PC space is almost complete.

A new story began to unfold a few years ago with the development of the mobile handset, portable player, multimedia display and set top box markets. Here ARM plays a dominant role similar to Intel’s: ARM does not manufacture CPU chips but licenses the design of its CPU core to manufacturers, who offer different CPU chips all built around the same core. As in the mainframe vs minicomputer case, evolution happens not because of competition but because new markets need new products.

Of course this page tells only one half of the story – the hardware half – because it does not mention the software component of the computer. For that we have to wait until some more actors come to the fore.