Although computers are still a relatively new invention, they rank among the most radical creations since the Industrial Revolution. Computing has its roots in ancient methods of calculation, such as the abacus, and in poorly understood artefacts such as the Antikythera Mechanism, but what we consider a true computer was only developed during World War II, independently in both Germany and the UK.
In Germany, Konrad Zuse built the Z3, primarily used for aircraft R&D, and is often considered the inventor of the modern programmable computer. Meanwhile, British codebreakers, building on Polish work that had cracked the ‘unbreakable’ Enigma encryption machine, constructed Colossus to decrypt the separate Lorenz cipher. Neither of these machines was fast or compact: computers of the day needed an entire room to themselves, and in place of the printed circuits we use today the Z3 relied on electromechanical relays and Colossus on thousands of vacuum tubes. Even so, Colossus’s role in decrypting German signals is credited with helping to bring World War II to a quicker and less bloody end.
Among the most notable computer scientists of the day was Alan Turing, who codified what a computer really was with the Turing Machine, designed the ACE at the National Physical Laboratory and coined the Turing Test: a computer’s ability to mimic the mannerisms and behaviour of a human being convincingly enough to fool a human judge. That ability to mimic human behaviour remains a core benchmark for a true Artificial Intelligence (AI) and has influenced science fiction for decades, from HAL 9000 in 2001: A Space Odyssey to Skynet in the Terminator franchise.
With the dawn of the Cold War, computers suddenly became one of the biggest focuses of military and government research, finding their way into all manner of military vehicles, rockets, civilian airliners, intelligence agencies, educational facilities and eventually the home. Computers played a central role in the Space Race, with the United States and the Soviet Union competing to project their military and scientific might into the stars and beyond. Without computers, it’s likely that rocket technology and space exploration would never have progressed beyond Germany’s experiments with guided ballistic missiles such as the V-2.
Although Cold War computing primarily served a never-ending arms race between two implacable foes, computers eventually became common in civilian life. The first ‘personal computers’ were very different from what we consider a computer today: the machine usually resided inside the keyboard, and there was no mouse, no GUI and only the most basic facilities. Further advances in technology made PCs faster, cheaper, more reliable, more accessible and more fun.
The 1980s saw the computer break out of the science lab and into people’s homes, with Commodore, Sinclair, Atari and Apple releasing affordable, approachable machines that could fill out spreadsheets, play video games, create documents and more. With the long-awaited arrival of computers for the civilian market came rapid performance upgrades and enhancements. The evolution of computer graphics shows how far we’ve come: the vector graphics of 1979’s Asteroids look rather quaint next to 1993’s 2.5D Doom, which is in turn humbled by today’s fully 3D, almost photorealistic recreations of cities in the Grand Theft Auto series.
To put things into perspective, the cutting-edge Apollo Guidance Computer that helped put Apollo 11 on the Moon is comfortably outclassed by the smartphone in your pocket today. Computers are in our methods of transportation, our kitchen appliances, our TVs and phones, and may someday be physically part of humanity.