An Abridged History of Computing

Date: April 15, 2018

If memory serves me right, I first saw a computer in 1988, in Grade 9, at my high school in India. I distinctly remember how my fellow students and I were in complete awe of these desktop machines in the school computer lab. Each computer had a fairly large case and a bulky black-and-white cathode ray tube (CRT) monitor connected to it via a thick cable. The keyboard connected to the computer case had a large footprint too. These computers were dedicated BASIC machines – BASIC being the programming language our school wanted us to learn and master. Fig. 1 shows a for-loop in BASIC.

10 FOR J = 1 TO 5
20 PRINT "HELLO WORLD!"
30 NEXT J
RUN

/*
Output:

HELLO WORLD!
HELLO WORLD!
HELLO WORLD!
HELLO WORLD!
HELLO WORLD!
*/

Fig. 1. A for-loop in BASIC

Despite my fascination with these new machines, I was unable to fully comprehend or appreciate the value of a computer. I wondered what the BASIC programming language might help us achieve and what the significance of this and other computers in our lives could possibly be. My doubts and apprehensions were perhaps not entirely out of place, as computers had not really pervaded the real world until then.

My next rendezvous with computers happened in 1992, while pursuing my electrical engineering degree in India. The computer lab in the engineering college had several desktop computers running FORTRAN 77. I soon learnt that the heart of these machines was a Digital Equipment Corporation (DEC) VAX mainframe system that occupied a large air-conditioned room somewhere near the computer lab. I found this room only to realize that it was always locked. Through the glass doors and windows, I could see large refrigerator-like units (with flashing lights) stacked side-by-side. Fig. 2 shows a VAX 11/780 machine.


Fig. 2. A Digital VAX 11/780 computer. Source: Wikipedia

An interesting problem that I tried to solve using FORTRAN 77 was to determine whether a given year is a leap year. Intuitively, I would just check if the year was evenly divisible by 4 and conclude accordingly. This simple rule, however, fails for most century years. Only recently have I learnt the complete rule: a year divisible by 100 is a leap year only if it is also evenly divisible by 400, which is why 1900 was not a leap year but 2000 was. Please refer to the correct C code in Fig. 3.

#include <stdio.h>

int main(void) {
    int year = 2020; // Input year here

    // A year is a leap year if it is divisible by 4, unless it is a
    // century year, in which case it must also be divisible by 400.
    if (((year % 4 == 0) && (year % 100 != 0)) || (year % 400 == 0))
        printf("%d is a leap year", year);
    else
        printf("%d is not a leap year", year);
    printf("\n");

    return 0;
}

/*
Output:

2020 is a leap year
*/

Fig. 3. Leap year program in C

By the mid-1990s, mainframe computers (like the VAX) started being overtaken and replaced by personal computers. An increasing number of people started using a desktop personal computer (PC) both at work and at home. Advances in silicon semiconductor technology – which made solid-state transistors and programmable integrated circuit (IC) chips possible (Fig. 4) – played a key role in ushering in the era of the PC. Very soon, despite being more expensive than desktop PCs, laptop PCs also saw a steady rise in production and sales.


Fig. 4. Transition from vacuum tubes to programmable IC chips

In my opinion, the two most groundbreaking developments in the realm of computers were: (i) the graphical user interface (GUI), popularized by the Apple Macintosh 128K in 1984, and (ii) the Internet, which by the 1990s provided a worldwide networking infrastructure.

The Merriam-Webster dictionary defines the GUI as “a program that allows a person to work easily with a computer by using a mouse to point to small pictures and other elements on the screen.” Both Apple and Microsoft were instrumental in introducing and enhancing mouse-and-pointer technology for PCs. This facilitated the development of user-friendly programs like MS Word, MS Excel, Adobe Photoshop, Apple iTunes, and many more. GUI technology, in conjunction with the associated hardware and software, provided a rich and colorful user experience, as opposed to the rather dull and somewhat intimidating command line. It allowed a variety of individuals, with or without in-depth knowledge of computer science, to use computers easily in their day-to-day lives.

The Internet, defined by the Merriam-Webster dictionary as “a communications system that connects computers and databases all over the world,” was, of course, the icing on the cake! The Internet placed a wealth of information at our fingertips. Books, news, college admissions, jobs, entertainment, etc. were just a mouse click away.

I started my master's and doctoral studies in computer science in the UK in October 2000, shortly after the Y2K bug scare that did not, in the end, create the havoc expected of it. During my education in the UK (2000-2006), I witnessed and learnt how desktop and laptop PCs could be used to perform various kinds of non-trivial computations. Applications as diverse as neural networks, fuzzy logic, Bayesian networks, digital signal processing (DSP), image/video processing, and financial stock market simulations ran successfully on these PCs (Fig. 5). This was also the time I was introduced to grid computing – several interconnected computers working as one – which was generally employed when computational tasks were too intense for a single PC to handle.


Fig. 5. A 2005 HP desktop PC with webcam. Source: http://www.donlsmith.net/computer.htm

My professional journey in Canada began in June 2006, and by early 2007, a relatively new technology called cloud computing was witnessing a steep rise in popularity and adoption. Cloud computing is conceptually similar to grid computing but operates on a much larger scale; in essence, it is web-based grid computing offered as an on-demand service. Cloud computing has seen several improvements over the years. Some enhancements that come to mind are faster processors, larger storage capacities, increased software/hardware resources, and improved data sharing and parallel processing protocols. Today, cloud technology is ubiquitous and indispensable, and it is here to stay.

However, what changed computing forever was the release of the first Apple iPhone in June 2007. The most appealing feature of the iPhone was its virtual user interface – the revolutionary touchscreen technology. Though I had used touchscreens on CRT monitors while working at the Tata Group (India) in 1997, seeing the technology refined and packed into a sleek mobile phone in 2007 was simply breathtaking.

The time from 2008 onwards is often referred to as the age of smartphones and tablet PCs. Leveraging the outcomes from nanotechnology research, companies like Apple and Samsung have been able to pack considerable computing power, speed, and features into relatively small form factor smartphones and tablets. These powerful touchscreen devices can accomplish tasks ranging from word processing to signal processing to gaming. With reference to smartphones, we often forget that they can make telephone calls too! Today, these devices are the easiest to handle, carry, and use. Often, children use them with greater proficiency than adults.

The proliferation of smartphones and tablets has caused a meteoric rise in the design and development of innovative mobile Apps. This has fundamentally changed the modus operandi of the software industry. Earlier, software for PCs was mostly created and sold by big companies. Today, individual and private App developers have a solid platform in mobile technologies for showcasing their talent, ideas, and abilities – WhatsApp Messenger (January 2009) and Uber Technologies Inc. (March 2009) are prime examples of pioneering and successful Apps.

The release of the Apple Watch in 2015 created yet another wave of mobile computing that completely changed the world’s outlook towards wearable devices (Fig. 6). Though the smartwatches of today accomplish most tasks by tethering to a smartphone, tablet, or PC, they offer a lot of convenience and functionality: users can receive and make phone calls, browse pictures, and measure their heart rate. The future looks bright for smartwatches, and we can expect to see a dramatic increase in their computing power and capabilities.


Fig. 6. Apple iPhone 8 (left) and Apple Watch (right)

A new buzzword, the Internet of Things (IoT), has become quite popular since 2016. IoT is all about the interconnectivity of entities such as vehicles, smartphones, and home appliances via the existing Internet infrastructure. All objects in an IoT network are “smart, aware, and responsive” in the sense that they are equipped with embedded hardware, software, sensors, and actuators. By sharing information and data across entities, IoT is poised to make invaluable contributions to automation technologies like smart homes, intelligent transportation systems (ITS), and smart cities.
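
To make the idea a little more concrete, the sketch below simulates a single IoT node, a hypothetical smart thermostat, in C. The device name, the sensor-reading function, and the publishing function are all illustrative placeholders: a real node would read an actual sensor (for example over I2C or an ADC) and push its readings to a cloud service over a protocol such as MQTT or HTTP, rather than printing them.

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical placeholder: a real IoT node would read this value
   from a physical temperature sensor. Here it is simply simulated. */
static double read_temperature_sensor(void) {
    return 18.0 + (rand() % 100) / 10.0;  /* simulated reading, 18.0-27.9 C */
}

/* Hypothetical placeholder: a real node would publish the reading to a
   cloud service over the network instead of printing it. */
static void publish_reading(const char *device_id, double temperature) {
    printf("[%s] temperature = %.1f C\n", device_id, temperature);
}

int main(void) {
    const double threshold = 24.0;      /* turn cooling on above this value */

    for (int i = 0; i < 5; i++) {       /* five simulated sensing cycles */
        double t = read_temperature_sensor();
        publish_reading("thermostat-01", t);

        if (t > threshold)
            printf("[thermostat-01] actuator: cooling ON\n");
        else
            printf("[thermostat-01] actuator: cooling OFF\n");
    }
    return 0;
}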

As of 2018, it seems that computing has come full circle in the last 50 years or so. The first handheld computers were the electronic pocket calculators of the 1970s. The first wearable computers were the digital LCD wristwatches that became popular around the same time. More powerful, versatile, and gigantic mainframe computing machines like the IBM System/370 and DEC VAX 9000 reigned supreme in the 1980s. Much smaller desktop PCs became popular in the 1990s, followed by yet smaller laptop PCs in the 2000s. Finally, from 2008 to 2018, computers were further miniaturized to match the size of the original electronic pocket calculators and digital LCD wristwatches of the 1970s. However, this last development cycle (2008-2018) was characterized by an extraordinary increase in processing power, as predicted by Moore’s law, the observation that the number of transistors on a chip roughly doubles every two years. Hence, the smartphones, tablets, and smartwatches of today were born.
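
As a rough back-of-the-envelope illustration, the short C program below estimates the growth factor implied by Moore’s law over that decade, assuming one doubling every two years (the two-year doubling period is an assumption, not a precise figure).

#include <stdio.h>

int main(void) {
    const int years = 2018 - 2008;      /* the decade discussed above */
    const int doubling_period = 2;      /* assumed: one doubling every ~2 years */
    double factor = 1.0;

    for (int i = 0; i < years / doubling_period; i++)
        factor *= 2.0;                  /* one doubling per period */

    printf("Approximate growth over %d years: %.0fx\n", years, factor);
    return 0;
}

/*
Output:

Approximate growth over 10 years: 32x
*/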

I see two major challenges for today’s mobile technology, namely, short battery life and high cost. However, these challenges are not deal-breakers in any way. I am hopeful that further improvements in technology, increase in demand, and mass production will successfully tackle these problems and a new sun will rise for mobile computing.

Roughly, the last 50 years have belonged to mainframe, desktop, and laptop computing. The next 50 years will surely belong to mobile technologies underpinned by cloud computing and operating within the IoT framework. We have been lucky to have lived in this era and witnessed all the growth and opportunities that computer science and computing have to offer.

As new technologies like self-driving cars evolve and discoveries like nanomushroom sensors are made, the possibilities are endless. Just stay tuned and enjoy!