-Karthik Gurumurthy
It’s wild to think how we got from mathematicians writing theoretical papers to the user-friendly software we take for granted today.
It all started with a brilliant British mathematician named Alan Turing. In 1936, the year he began graduate work at Princeton, he published a paper called “On Computable Numbers” that described a theoretical computing machine that worked by reading and writing symbols, one at a time, on a long paper tape. Turing's conclusion was that such a machine could, in principle, carry out any calculation by following simple steps – a revolutionary idea when computers were barely even a concept!
The person who turned these theories into reality was John von Neumann, a Hungarian-born American mathematician. During WWII, he worked on the Manhattan Project, where scientists were using primitive computers that had to be physically rewired for each new calculation – can you imagine?
In 1946, the ENIAC (Electronic Numerical Integrator and Computer) was unveiled at the University of Pennsylvania, built for the U.S. Army. This behemoth wasn’t programmable in the modern sense – setting it up to solve a problem meant physically arranging cables and switches, which could take days! Naturally, people wanted something better.
Von Neumann’s big breakthrough was the stored-program design, an idea he outlined in the mid-1940s and embodied in the IAS computer completed in 1951. It established the basic elements we still use today: memory, a central processing unit, input/output components, and a program held in memory alongside the data. That last piece was the most significant innovation: instead of rewiring the machine for every new problem, you simply loaded a different program.
The earliest programming was done in machine language – just the strings of ones and zeros the computer could understand directly. Talk about tedious! In the early 1950s this gave way to assembly language, which replaced raw binary with short mnemonics like “MPY” for “multiply.” Around the same time, Grace Murray Hopper, working on the UNIVAC, built some of the first tools for automatically translating symbolic code into machine instructions.
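To make that contrast concrete, here is a toy sketch (in Python, purely for readability; the mnemonics and opcode values are invented for illustration, not taken from any real machine) of the basic job an assembler does: turning human-friendly mnemonics into the binary instructions a machine actually runs.

```python
# Toy illustration, not a real assembler: mapping symbolic mnemonics
# to made-up numeric opcodes, the way early assemblers spared
# programmers from writing raw ones and zeros by hand.
OPCODES = {"LOAD": 0b0001, "MPY": 0b0010, "STORE": 0b0011}

def assemble(line):
    """Translate one line like 'MPY 5' into a 12-bit binary instruction string."""
    mnemonic, operand = line.split()
    return f"{OPCODES[mnemonic]:04b}{int(operand):08b}"

program = ["LOAD 7", "MPY 5", "STORE 12"]
for line in program:
    print(f"{line:10} -> {assemble(line)}")
# LOAD 7     -> 000100000111
# MPY 5      -> 001000000101
# STORE 12   -> 001100001100
```

The point isn’t the details of any particular instruction set; it’s that a programmer could now write “MPY 5” and let a program do the drudgery of producing the bits.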
Then came the higher-level languages that revolutionized programming:
- FORTRAN (1950s): Created by IBM for math and science
- COBOL (1959): Designed by an industry committee building on Grace Hopper’s earlier work, using English words and phrases for business applications
- BASIC (1964): Created at Dartmouth College to be simple enough for students and other non-professionals
- C (1970s): Developed by Dennis Ritchie at Bell Labs for systems programming; it was used to write the Unix operating system
- C++ (1980s): Bjarne Stroustrup’s extension of C, also from Bell Labs, adding object-oriented features
- Java (1995): Created by Sun Microsystems for portable programs that could run anywhere, which soon brought interactive content, sound, and video to the web
Operating systems evolved alongside programming languages. The first operating systems for mainframes appeared in the mid-1950s. On personal computers, an important early system was 86-DOS, developed at Seattle Computer Products around 1980. When IBM needed an operating system for its new PC, it turned to Microsoft, which bought 86-DOS and renamed it MS-DOS. IBM sold it as PC-DOS, and most users just called it DOS.
DOS had a text-only interface that required memorized commands, which limited its appeal. In 1995, Windows 95, with its graphical user interface, replaced DOS as the most popular PC operating system. By March 1996, an impressive 30 million copies of Windows 95 had been purchased!
Looking at the progression from physicists rewiring machines to modern software that lets us point and click, it’s amazing how quickly we’ve come from esoteric coding to intuitive interfaces. Even more incredible is how these basic concepts developed by mathematicians and scientists in the 1930s-1950s are still at the heart of the devices we use every day!