A 2.7GHz processor, 16GB of RAM, and a 500GB solid state drive that is almost out of space. Quite frankly, the laptop that I am currently working on is getting a bit old. However, thinking back to 1985, and my newly acquired Amstrad CPC 6128, I was pretty thrilled with the 4MHz processor, 128K of memory, and a 180K floppy drive. Times, quite clearly, have changed.

It feels almost clichéd to talk about how IT has advanced, but it is nonetheless revealing to look at exactly how far we have come. Indeed, when we look at specifications such as those above, it’s generally a story of numbers getting bigger – much bigger. The central processors handle more bits and have higher clock speeds; network communications are measured in ever more bits per second; systems have more memory and bigger storage; graphics are characterised by more pixels and more colours; and sound chips have evolved to offer more channels over more octaves.

Of these items, it is the CPU specification that has generally come to be regarded as the most fundamental measure of computing power. This in turn is fuelled by technology following what has become known as Moore’s Law – a 1965 projection from Intel co-founder Gordon Moore suggesting that the number of transistors on an integrated circuit would double every year (a figure that he revised to every two years in 1975). Although the rate is now considered to be slowing, his prediction has essentially held true for the last 50 years, meaning that computing technology can be considered to have doubled in power (or, equivalently, a given level of power has halved in price) every couple of years.

This is illustrated by the plot in Figure 1, showing the increasing transistor counts for a selection of Intel processors over the last 40 years or so. The number has risen dramatically, from 2,300 in their first commercially available processor, the 4-bit 4004, through to 5.56 billion in the current 18-core Xeon E5 (Haswell-EP).

Figure 1. Increasing transistor counts for Intel processors.

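Those two endpoints also let us sanity-check the doubling rate itself. The short Python sketch below works out the implied doubling period; the transistor counts are those quoted above, while the launch years (1971 for the 4004, 2014 for the Haswell-EP part) are assumptions for illustration:

```python
import math

# Transistor counts quoted above; launch years are assumed for illustration.
t0, year0 = 2_300, 1971            # Intel 4004
t1, year1 = 5_560_000_000, 2014    # 18-core Xeon E5 (Haswell-EP)

growth = t1 / t0                                 # overall growth factor
doublings = math.log2(growth)                    # number of doublings implied
years_per_doubling = (year1 - year0) / doublings

print(f"Growth factor:      {growth:,.0f}x")            # ~2,417,391x
print(f"Doublings:          {doublings:.1f}")           # ~21.2
print(f"Years per doubling: {years_per_doubling:.1f}")  # ~2.0
```

Roughly 21 doublings in 43 years comes out at almost exactly one every two years – in line with Moore’s revised 1975 projection.
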
At the same time, the physical size of the devices, and perhaps more notably the costs, have decreased. In terms of size, the billions of transistors in Intel’s Xeon processor would sit comfortably in the palm of your hand. Meanwhile, the racks required to house the 1,600 valves (the precursors of the transistor) in the first programmable electronic computer, 1943’s Colossus, easily filled a large room.

Cost has tumbled even further, and we don’t have to look too far back to reach a point at which today’s power simply wasn’t available, no matter what you were prepared to pay for it. Meanwhile, even at a relatively early stage, Atari’s late-80s marketing slogan – “Power without the price” – was essentially capitalising on what Moore’s Law had delivered.

In parallel with advancing specifications, IT is continually enabling us to do fundamentally new things, rather than just doing the same things better (which tends to differentiate it from the advancements we see in other technologies such as televisions or cars). We take for granted that today’s computers can play CD-quality music and handle broadcast-quality video, but if we look back to the micros of the early 80s, we were lucky to get more than a few beeps out of some of them, and there was no chance of playing video. Indeed, this was an era when many systems were still actively boasting about the ability to display colour graphics at all (with the baseline PC graphics standard of the era, CGA, typically offering a grand total of 4 colours on the screen at any time, from an overall palette of 16). Meanwhile, the first Apple Macintosh, despite its many other innovations when launched in 1984, was resolutely monochrome (and indeed remained that way until the Macintosh II in 1987).

The Mac is actually an interesting case, as it is a rare (and perhaps unique) example of the same range of computers still being sold over 30 years later. However, aside from the name, almost everything else has changed – as illustrated by Table 1, which compares the specifications of the original Mac from January 1984 with a baseline Retina display model from December 2015. It should be noted that in addition to the considerable increase in the listed specifications, the modern system also has a multitasking operating system, wireless networking, a webcam, and various other features that had no equivalent on the original system. And yet the bottom line is that the price today is almost $1,000 less than it was in 1984 (and indeed, if the original price is adjusted for inflation, it comes out at almost $6,000 in today’s money).
 

System    | Specification | Apple Macintosh (1984) | Apple iMac (2015)
CPU       | Processor     | 68000                  | Quad-core Intel Core i5
          | Speed         | 7.8336 MHz             | 3.1 GHz
RAM       |               | 128K                   | 8GB
Storage   | Type          | 3.5” floppy disk       | Hard disk
          | Capacity      | 400K                   | 1TB
Display   | Resolution    | 512×342                | 4096×2304
          | Colours       | 2 (black and white)    | Millions
          | Size          | 9”                     | 21.5”
Price     |               | $2,495                 | $1,499

Table 1. Comparing the specifications of Apple Macintosh systems over 30 years apart.
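
As a rough check on the inflation claim above, here is a minimal sketch using approximate annual-average US CPI-U values (about 103.9 for 1984 and 237.0 for 2015 – treat these as illustrative assumptions rather than figures from the original article):

```python
# Adjust the 1984 price into 2015 dollars via the ratio of CPI values.
# CPI figures are approximate annual averages, used here for illustration.
price_1984 = 2495
cpi_1984 = 103.9   # US CPI-U, 1984 annual average (approximate)
cpi_2015 = 237.0   # US CPI-U, 2015 annual average (approximate)

adjusted = price_1984 * cpi_2015 / cpi_1984
print(f"${adjusted:,.0f} in 2015 dollars")  # ~$5,691, i.e. almost $6,000
```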
 

The increasing specifications have fundamentally altered the things we take for granted and regard as normal. For example, we now routinely email documents that would exceed the entire memory capacity of high-spec PCs from 15-20 years earlier. Illustrating a similar point, Figure 2 shows a photograph of a 20MB IBM hard disk from the mid-1980s that we have in our retro computing collection. Aside from the fact that this capacity sits in stark contrast to the multi-terabyte drives we see today, one can also make the somewhat ironic observation that the drive would not actually have enough space to hold a high-resolution version of the same photo.

Figure 2. IBM hard disk drive with 20MB capacity.

Staying with the topic of storage, people today often turn their noses up at 1GB USB sticks, regarding them as too small to store very much useful content. However, such a stick has over 61,000 times the memory capacity of the original IBM PC from August 1981 (whose base configuration shipped with just 16K of RAM), and over 6,100 times the capacity of the single-sided 160K disks that the same systems used for storage.
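
Those multipliers follow directly from the raw capacities, taking 1GB in the decimal sense that storage vendors use and assuming the 16K of RAM in the base 1981 machine:

```python
# Compare a 1GB USB stick with the original IBM PC's capacities.
usb_stick = 1_000_000_000     # 1GB in the decimal sense used by drive makers
ibm_pc_ram = 16 * 1024        # base IBM PC (1981) shipped with 16K of RAM
floppy = 160 * 1024           # single-sided 160K diskette

print(f"vs base RAM:  {usb_stick / ibm_pc_ram:,.0f}x")  # ~61,035x
print(f"vs 160K disk: {usb_stick / floppy:,.0f}x")      # ~6,104x
```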

Having said all of this, the continual advancements of IT do not necessarily mean that we are living with abundance and have capacity to spare. Today’s applications continually demand more power, and while manufacturers present ever more dizzying figures and claim that their technology works at blistering speeds, these increases are often masked by what the software now requires in order to run at an acceptable pace (as anyone who has tried to run the latest operating systems and applications on minimum-spec kit would likely attest!).

Similarly, just because we have more capacity, it doesn’t always mean that we use it wisely or efficiently. As a simple example, placing the text of this article (ignoring the graphics and table) into a new Word document created a 129K file, versus just 8K when saved in plaintext format. Such overheads mean that other technologies (e.g. memory, storage and network communications) must also improve in order to keep up with the demands of the ever-larger media that we expect to be using.
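
That 129K-versus-8K comparison amounts to an overhead factor of roughly sixteen, and the measurement is easy to reproduce. A minimal sketch (the file names here are hypothetical placeholders, not the actual files used):

```python
import os

# Report the on-disk size of the same text saved in two formats.
# File names are hypothetical placeholders for this sketch.
for path in ("article.docx", "article.txt"):
    print(f"{path}: {os.path.getsize(path) / 1024:.0f}K")

print(f"Overhead factor: {129 / 8:.1f}x")  # using the figures quoted above
```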

Of course, while they make for potentially interesting observations, none of the latter points really represents any cause for concern. We would not have been able to drift into such practices if the technology hadn’t evolved to allow it, and we can rely upon further advancements to ensure that technology continues to keep pace. Indeed, even if Moore’s Law slips a bit, it still seems likely that technological revolution will remain the basis for business as usual in the world of IT.

  • Prof. Steven Furnell is head of the School of Computing, Electronics and Mathematics at Plymouth University, and co-curates the South West Retro Computing Archive
  • For more on the evolution of computing power, read Dr Fisher's October 2014 column, 'Tools of the trade'.
