History of Computation | Dr. Boaz Tamir
Present-day questions in computer science are best understood against their historical background. New models of computation, such as quantum computers or DNA computers, implement old ideas about computing machinery in new physics and genetics. We therefore follow the history of computers from their early stages, as far back as the 16th and 17th centuries, up to the present day, tracing the evolution of the notion of a 'computer'.
The invention of the first digital computer is attributed to Charles Babbage. We dwell on the 19th-century sociological background that drove Babbage to his inventions: the 'difference engine' and the 'analytical engine'. The failure to construct those 'engines' ends this short digital episode, which is followed by the rise of the analog computer. Analog machines compute by simulating physical phenomena, an idea very close to most modern physical theories of computation.
New ideas in mathematics at the beginning of the 20th century give rise to the best-known abstract model of the classical digital computer: the 'Turing machine'. In the middle of the 20th century, the digital computer is re-invented. We discuss the benefits of digitalization and the reasons for its comeback. In the shadow of World War II, von Neumann presents his 'computing architecture', a modern version of Babbage's old analytical engine. Von Neumann is mainly occupied with the possible use of such a machine in physics.