The History of Computing in the History of Technology

Chanda Monga, Minakshi Sharma, Etti Sharma


In the standard story, the computer's growth has been rapid and short. It starts with the giant machines warehoused in World War II–era laboratories. Microchips shrink them onto desktops, Moore's Law predicts how powerful they will become, and Microsoft capitalizes on the software. Finally, small, cheap devices emerge that can trade stocks and beam video around the world. That is one way to approach the history of computing: as the history of solid-state electronics over the past 60 years. But computing existed long before the transistor. Ancient astronomers developed ways to predict the motion of the heavenly bodies. The Greeks deduced the shape and size of Earth. Taxes were summed; distances mapped. Always, though, computing was a human pursuit. It was arithmetic, a skill like reading or writing that helped a person make sense of the world. The age of computing sprang from the abandonment of this limitation. Machines such as cash registers came first, organizing mathematical computations using what we now call "programs." The idea of a program first arose in the 1830s, a century before what we traditionally think of as the birth of the computer. Later came modern electronic computers, capable of any kind of information processing, including the manipulation of their own programs. These are the computers that power our world today. This paper discusses some of the major issues addressed by recent work in the history of technology and suggests aspects of the development of computing relevant to those issues, for which that recent work could provide models of historical analysis.



Copyright (c) 2018 Edupedia Publications Pvt Ltd

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

