The history of computers | Sunday Observer

The history of computers

4 April, 2021
Vannevar Bush with his Differential Analyzer

Most of you might be wondering how the computer we use today developed into such a complex machine. Well, today let’s find out about the history of this marvellous creation.

Computer precursors

To find out about the origin of the first calculating device, we must go back nearly three thousand years in time, to the flourishing Babylonian Empire. The earliest "abacus" was a board or slab on which the Babylonians spread sand in order to trace letters for writing.

The abacus developed over the centuries into a device that could be used to do complex calculations. The form of this device which is most common today has beads strung on wires. Since each bead sits in one definite position or another, representing values discretely rather than continuously, the abacus can be defined as a digital device.

In 1614, John Napier introduced another ground-breaking aid to calculation, known as "logarithms". If the name sounds familiar, it's because this Scotsman was also the inventor of Napier's bones.

Napier invented this mathematical system to transform a multiplication problem into an addition problem. By 1624, tables with 14 significant digits were available for the logarithms of numbers from 1 to 20,000, and scientists quickly adopted this new labour-saving tool for tedious calculations.
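The trick those tables automated can be sketched in modern code: look up the logarithms of the two numbers, add them, and convert the sum back with the antilogarithm. Here is a minimal Python illustration (the numbers 123 and 456 are arbitrary example values, not from the article):

```python
import math

# Two numbers we want to multiply (arbitrary example values)
a, b = 123.0, 456.0

# Step 1: look up (here: compute) the logarithms and add them.
# log10(a * b) == log10(a) + log10(b)
log_sum = math.log10(a) + math.log10(b)

# Step 2: take the antilogarithm (10 raised to the sum)
# to recover the product.
product = 10 ** log_sum

print(product)  # very close to a * b (tiny floating-point error aside)
print(a * b)
```

A 17th-Century scientist did both "steps" with a printed table instead of a computer, but the saving is the same: a long multiplication is replaced by one addition and two table look-ups.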

But it was over two centuries later that Charles Babbage, an English mathematical genius often called the father of the computer, began to put all these discoveries together and attempted to make the first precursor of the modern computer.

The humble beginnings of the computer

In 1822, Charles Babbage had the idea of creating a calculating machine, which he called the Difference Engine, that could handle numbers longer than six digits. He sent a letter to the president of the Royal Society, Sir Humphry Davy, about his idea to automate the creation of logarithm tables. "This could ensure the tables' accuracy", he argued in the letter. Accuracy could save lives, because in the 19th Century sailors depended on logarithm-based tables to navigate and find their position at sea.

What made the Difference Engine unique compared to other calculators was that, like a modern PC, it had storage for holding numbers; it was also designed to stamp its output onto metal plates for printing.

But unfortunately, Babbage ran into problems while developing the machine. The Government's funding often ran out, and Babbage was working at the very limits of what 1800s engineering could achieve. The project ground to a halt when Joseph Clement, the machinist responsible for building the Engine, refused to work unless he was prepaid.

But by the time funding for the Difference Engine had run out in 1833, Babbage had conceived something far more revolutionary: a general-purpose computer called the Analytical Engine that could perform any calculation set before it.

No-one before Babbage had ever even thought of such a device, let alone attempted to build one.  

The components of the Analytical Engine correspond to the essential components of every computer today. The mill was the calculating unit, equivalent to the CPU; the store was where data was held prior to processing, similar to the memory and storage in today's computers; and the reader and printer were the input and output devices.

The Analytical Engine would have been a real computer, had Babbage not run into problems again. Babbage's failure to generate the promised mathematical tables with his Difference Engine had dampened enthusiasm for further Government funding. In 1930, about a century later, an engineer named Vannevar Bush at the Massachusetts Institute of Technology (MIT) developed the first modern analogue computer.

The Differential Analyzer, as he called it, was an analogue calculator that could be used to solve differential equations, a type of problem common in physics and engineering applications.

While Bush was working on analogue computing at MIT, across town Harvard Professor Howard Aiken was working with digital calculators. Starting in 1937, he laid out detailed plans for a series of four calculating machines of increasing sophistication and various technologies, from the largely mechanical Mark I to the electronic Mark IV.

The business machines of the time used plugboards (similar to telephone switchboards) to route data manually, and Aiken chose not to use them. This made his machine much easier to program than the ENIAC computer, which had to be manually rewired for each program.

From 1939 to 1944, Aiken developed the Harvard Mark I with IBM. The machine was more than 15 metres long, weighed five tons, and consisted of about 750,000 parts. For this machine, Aiken is credited with developing the first fully automatic large-scale calculator.

All these devices have, in one way or another, paved the way for the invention of the first modern computer: the ENIAC. But since we've been covering the history of computers, I think it's time for me to put my pen down, or rather step away from the PC, and say goodbye to all of you enthusiastic computer nerds out there.


Dinara Hettiarachchi

Grade 8

Ananda College

Colombo 10