How Long Have Computers Been Around?

Thanks to today’s technology, people are able to enjoy an easier and more comfortable way of life. Communication across any distance is a breeze, work and research can be completed more easily, and information can be accessed by anyone, anywhere, as long as they have an internet connection and a computer or mobile device.

But have you ever wondered where computers originated or how long they have been around?

How long have computers been around? If you count mechanical computers, they have been around for about 200 years. If you count just the kind of personal home computers we use today, they have been around for about 50 years.

The first mechanical computer was created by Englishman Charles Babbage in 1822 (about 200 years ago). The first personal computer similar to the ones we use today, the Kenbak-1, was introduced in 1971 (about 50 years ago) by American John Blankenbaker. The term “personal computer” entered popular use in 1975, when the Altair 8800 was introduced.

There’s much more to the history of the computer that you might find fascinating. It took about 150 years to go from the mechanical computer of 1822 to the first “personal computer” of 1971. From 1971 to today, in only about 50 years, advancements in computer technology have come incredibly fast.

Keep reading to learn more about how long computers have been around, where they originated, and the future of computers.

The Term “Computer”

The first recorded use of the word “computer” was in 1613, when it described a person who did computations or calculations. The definition stayed the same until the 19th century, when machines built to perform those computations began to appear. Today, we know a computer as an electronic device that stores and processes data according to the instructions its user gives it.

The First Mechanical Computer

Charles Babbage’s 1822 Difference Engine

It was 1822 when Charles Babbage, an English polymath, began developing the Difference Engine. This device was able to compute several sets of numbers and print hard copies of the results, and it is considered to be the first automatic computing machine. Ada Lovelace, regarded as the first computer programmer, later worked with Babbage on his machines. Sadly, because of a lack of funding, a full-scale functioning model was never made.

In 1837, Babbage proposed the first general-purpose mechanical computer, which he called the Analytical Engine. The design featured an arithmetic unit (the “mill”), a forerunner of the arithmetic logic unit that performs mathematical operations in today’s computers. Inspired by the Jacquard loom, the Analytical Engine also used punched cards for input and had integrated memory. Again, due to a lack of funding, the device was never built.

The Z1 by Konrad Zuse

Between 1936 and 1938, the Z1 was born. This device is considered to be the first truly functional modern computer. The Z1 was originally named V1 by its developer, Konrad Zuse, who built the machine in his parents’ living room. It was a binary programmable computer with a 64-word memory that read its instructions from punched tape.

The Turing Machine by Alan Turing

The Turing machine was proposed in 1936 by Alan Turing, an English mathematician, logician, and computer scientist regarded as the father of computer science. This theoretical machine paved the way for further theories about computers and computing. A Turing machine reads and writes symbols on a tape according to a table of rules, much as a person would if they were working through a set of logical instructions.
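
To make that idea concrete, here is a minimal sketch of a Turing-style machine in Python. It is only an illustration, not Turing’s original formalism: the tape, the read/write head, and the rule table are the essential ingredients, while the specific rules below are an invented example that adds 1 to a binary number.

```python
# A minimal Turing-machine sketch: a tape, a head, and a rule table.
# The rule table below is an invented example that adds 1 to a binary
# number written on the tape, with the head starting at the rightmost digit.

def run_turing_machine(tape, rules, state="carry", blank="_"):
    tape = list(tape)
    head = len(tape) - 1                      # start at the rightmost symbol
    while state != "halt":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < 0:                          # grow the tape on the left if needed
            tape.insert(0, write)
            head = 0
        else:
            tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape)

# (current state, symbol under head) -> (symbol to write, move, next state)
increment_rules = {
    ("carry", "1"): ("0", "L", "carry"),      # 1 plus a carry becomes 0, keep carrying left
    ("carry", "0"): ("1", "L", "halt"),       # 0 plus a carry becomes 1, done
    ("carry", "_"): ("1", "L", "halt"),       # ran off the left edge: write the new leading digit
}

print(run_turing_machine("1011", increment_rules))  # prints 1100 (11 + 1 = 12)
```

The real Turing machine is a mathematical abstraction with an unbounded tape; the point of the model is that, in principle, any computation can be broken down into steps this simple.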

The Colossus by Tommy Flowers

Driven by the British need to read encrypted German messages, the Colossus was born. Developed by Tommy Flowers and first demonstrated in 1943, it proved extremely valuable during World War II, helping the British gather crucial intelligence. The Colossus is also regarded as the first programmable electronic digital computer.

The ABC by Professor John Vincent Atanasoff

The ABC is short for Atanasoff-Berry Computer, a device developed by Professor John Vincent Atanasoff and Clifford Berry between 1937 and 1942. It was an electronic computer that used over 300 vacuum tubes for digital computation, but it had no CPU, which meant it was not programmable.

The ENIAC

The ENIAC was developed and built by J. Presper Eckert and John Mauchly between 1943 and 1946. The machine was huge: it took up about 1,800 square feet, weighed about 30 tons, and used around 18,000 vacuum tubes. The ENIAC is considered by many to be the first fully functional electronic digital computer.

Stored Program Computers

As inventors continued their pursuit of creating and improving the computer, the SSEM, short for Small-Scale Experimental Machine, was born in 1948. It was the first device able to electronically store and run a program. It was designed by Frederic Williams and built by Tom Kilburn and Geoff Tootill.

A year later came the British EDSAC, the Electronic Delay Storage Automatic Calculator, designed and built by Maurice Wilkes and his team. It later ran one of the earliest computer games, a version of tic-tac-toe.

It was also around this time that the Manchester Mark 1 was born, another machine able to run stored programs. From there, inventors and mathematicians continued to build and improve on these devices, which read their instructions as words stored in the computer’s memory. Over the following years, the first computer company was established, the first commercial computer was built, IBM entered the field, and computers began to be equipped with RAM and graphics.
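
The key idea behind these stored-program machines is that instructions live in the same memory as data, and the computer works through them one at a time in a fetch-decode-execute loop. Here is a loose, modern-flavored sketch of that idea in Python; the tiny instruction set below is invented for illustration and is not the SSEM’s or EDSAC’s actual one.

```python
# A loose illustration of the stored-program idea: the program sits in
# memory alongside its data, and a simple fetch-decode-execute loop
# works through it word by word. The instruction set here is made up.

memory = [
    ("LOAD", 6),     # 0: copy the value at address 6 into the accumulator
    ("ADD", 7),      # 1: add the value at address 7 to the accumulator
    ("STORE", 8),    # 2: write the accumulator back to address 8
    ("PRINT", 8),    # 3: print the value at address 8
    ("HALT", None),  # 4: stop
    None,            # 5: (unused)
    20,              # 6: data
    22,              # 7: data
    0,               # 8: result goes here
]

accumulator = 0
pc = 0  # program counter: address of the next instruction to fetch

while True:
    op, addr = memory[pc]          # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "PRINT":
        print(memory[addr])        # prints 42
    elif op == "HALT":
        break
```

Every modern CPU is, at heart, an elaboration of this loop: a program counter, instructions held in memory, and a cycle of fetching and executing them.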

How Computers Evolved from the 50s to the 70s

The first personal computer is considered to be John Blankenbaker’s Kenbak-1, introduced in 1971

There were some major advancements in computer technology from the 1950s to the 1970s. I’ll now list some of the more important ones from this period.

IBM introduced its first commercial scientific computer in 1953.

The first computer with RAM, the Whirlwind machine, was created by MIT in 1955.

The first transistor computer, the TX-0, was introduced in 1956 by MIT.

By 1960, the size of those massive computers had finally begun to shrink, as the Digital Equipment Corporation released its minicomputer, the PDP-1.

The first desktop computer is considered to be the Programma 101, introduced in 1964. It also became the first mass-marketed desktop computer.

The first microprocessor, the Intel 4004, was released by Intel in 1971.

The first personal computer is considered to be John Blankenbaker’s Kenbak-1, introduced in 1971. It sold for $750 (about $5,000 in today’s dollars) and used switches you flipped on and off to input data.

In 1973, the first microcomputer, the Micral, was released. It was the first commercial non-kit computer built around a microprocessor and cost $1,750 at the time (about $10,000 in today’s dollars).

The first workstation, the Xerox Alto, was developed in 1973. It included the computer, display, and mouse, which was revolutionary at the time, along with icons, windows, and menus.

In 1975, Ed Roberts released the Altair 8800 and is often credited with coining the term “personal computer”.

Home Computers

Home computers grew in popularity toward the end of the 70s. By 1977, they were being marketed and sold as affordable computers meant for a single, non-technical user. These home computers cost less than business machines but also had less memory and power.

Home computers were used mainly for playing games, word processing, and completing homework. Home computers were also known for better sound and graphics than their more advanced business counterparts.

Home computers of that era were surprisingly similar to what we use today: they came with a keyboard built into the same case as the motherboard and included expansion ports. Most home computer setups consisted of the CPU/keyboard unit, a floppy disk drive, and a color monitor. Some systems also had a dot matrix printer, and that was basically it in terms of expansions.

Notable Home Computers

In the USA, the most popular home computers up to 1985 included:

  • TRS-80 in 1977
  • Different models of the Apple II family introduced in 1977
  • Atari 400/800 in 1979, as well as follow-up models, the 800XL and 130XE
  • Commodore VIC-20 in 1980
  • Commodore 64 in 1982

Even though personal computers at the time were viewed as little more than electronic replacements for typewriters or as gaming consoles, it is believed that there were around a million personal computers in the United States by the early 1980s.

Home computers were marketed as the technology that would change the world, and everyone wanted to be part of it. Decades later, the computer did deliver on its promise, and it did change the world we live in.

The Evolution of Computers

Apple’s first computer, the 1976 Apple-1

Looking back, it’s hard to remember what life was like without desktop computers, laptops, tablets, and mobile phones. These devices play such a huge part in our everyday lives that many of us use them from the moment we wake up until we go to bed at night. Modern computers have had an enormous effect on society, touching almost every aspect of our lives.

First-Generation Computers

First-generation computers looked and functioned very differently from the computers we have today. Built between 1940 and 1956, they were enormous in size and extremely heavy. They made use of magnetic drums and vacuum tubes and were generally unsophisticated.

These computers also produced a large amount of heat and overheated regularly. They were programmed in machine language, the most basic programming language.

Second-Generation Computers

Second-generation computers were designed and developed between 1956 and 1963. Transistors replaced the vacuum tubes, which meant the computers used less electricity and produced less heat.

These computers were also faster and smaller than first-generation ones. In addition to magnetic storage, the second generation saw the development of magnetic-core memory.

Third-Generation Computers

Third-generation computers, built between 1964 and 1971, brought semiconductor chips, greater speed, and keyboards and monitors. They were smaller and more powerful, yet less expensive.

These machines allowed users to interact through a keyboard and monitor, doing away with the punched cards and printouts of earlier systems.

Fourth-Generation Computers

Fourth-generation computers span roughly 1971 to 2010 and saw the greatest changes in computing as we know it. Technology developed to the point where millions of transistors could be placed on a single chip, giving rise to the monolithic integrated circuit.

This was also when the Intel 4004, the first commercially available microprocessor, appeared in 1971. The fourth generation saw the advent of the personal computer industry.

The mid-70s saw personal computers like the Altair 8800 sold to the public as kits that needed assembly. Later, pre-assembled computers such as the Commodore PET and Apple II became available.

Personal computers could be linked into networks, which eventually led to the internet boom of the 1990s. Laptops and hand-held devices also came out of the fourth generation, which brought major leaps in storage capacity and processing speed.

Fifth-Generation Computers

Fifth-generation computers are what we will see in the future, with even faster and more advanced technology. Although nothing is set or clearly defined when it comes to the fifth generation, we should see developments in nanotechnology, artificial intelligence, and other areas as technology advances along multiple paths.

The Future of Computers

The computers of the future won’t just be the rectangular objects we hold in our hands. They already are, and increasingly will be, in almost everything we touch: our cars, our refrigerators, even our light switches. Soon, all of our devices will be able to communicate with each other.

Imagine your phone controlling all the lights and electricity in your home via an app, even while you’re miles away. It’s possible that one day, we may be able to experience the kind of technology that we only see in sci-fi movies.

In fact, we already have “smart bulbs” like the Philips Hue White and Color Ambiance Bulbs (click to see price and specs on Amazon). These bulbs can be controlled by voice through Amazon Alexa or Google Assistant and can display 16 million colors and shades of white with simple commands.

Or consider the Google T3007ES Nest Learning Smart Thermostat (click to see price and specs on Amazon), which learns the temperatures you like and programs itself, adjusts the temperature automatically based on the weather outside, turns itself down when you’re away, and, of course, offers voice control.

And let’s not forget the steady rise of virtual reality. The Oculus Quest All-in-one VR Gaming Headset (click to see the specs and price on Amazon) is becoming more and more popular, mostly among gamers, but there is also a growing market of non-gamers who want to experience virtual reality in the comfort of their own homes.

Since technology shapes our lives and the world we live in, we should remember that these changes should be available to everyone, not just those who can afford them. For example, 3D-printed and robotic prosthetics should be accessible and available to anyone who needs them, regardless of income.

Scientists and technologists should work towards changes and advances that benefit everyone. They should be trained to see not only what their designs and inventions can do technologically, but also how they will impact people and their lives.

So what will the future bring? No one knows. But we can only hope that whatever gadgets or other technological inventions come next, they will work towards creating a better bond between people.

These innovations should help create and strengthen that bond, and allow nations and people to better understand one another.

Conclusion – How Long Have Computers Been Around?

Computers have touched and influenced our lives and the whole world in more ways than one. Work, communication, and access to information are much easier thanks to computers and the internet. But it wasn’t always this way; not so long ago, there were no computers at all.

How long have computers been around? There are different classifications of computers, so the answer varies by type. In 1975, the Altair 8800 was introduced, and the term “personal computer” entered popular use.

The Kenbak-1, introduced in 1971, is considered the first personal computer ever launched. So one could argue that computers like the ones we use today have been around for about 50 years, since the early 1970s.

Even with the many changes the computer has gone through, from simple computing machine to the devices we enjoy today, one thing is certain: the future will bring us even more technological advancements that will greatly impact our lives and the world we live in.