History of Technology and Computing

Lesson Content

Inquire: Where Did Our Computers Come From?

Overview

From the first computers of the 1950s to the tablets and smartphones that seem to dominate so much of our lives, computers have evolved and changed not only their form factors but our very culture. It is important to know where the technology has come from to think about where it might go.

When you have completed this lesson, you will be able to describe the basic history of computers from the 1950s until modern day. You will also understand the impact computing has had on society.


Big Question

Look at your smartphone or tablet; where did the technology that powers it come from? Where did it all begin?

Watch: The Evolution of Computers

Read: The History of Computers

Overview

The full history of information technology as we know it, the managing and processing of data with a machine and software, dates to the early 1940s. One of the first programmable digital devices was Colossus, the decryption machine developed by the British during World War II. The honor of being the first general-purpose digital computer belongs to the Electronic Numerical Integrator and Computer (ENIAC). It would be several more years before the Universal Automatic Computer (UNIVAC) appeared in 1950 as the first computer capable of storing programs and running them from memory. The technology evolved quickly from that point, from systems that filled entire rooms to the pocket-sized devices we take for granted today.

The Mainframe Era

Throughout the 1950s and 1960s, the mainframe was king. Built first with thousands of vacuum tubes, these massive computers could fill entire rooms, or even buildings. Their size and cost meant they existed only in the hands of the largest businesses, universities, and government agencies. Programs and data were fed in on punched cards and paper tape, and specialized teams were needed to operate and maintain the machines and handle data entry. Once the data was entered, all a user could do was await the results of the system’s calculations, which could take hours depending on the complexity of the program.

By the mid-1960s, so-called mini mainframes (minicomputers) had entered the market. While still massive by today’s standards, several of these machines could fit in a single room, and parallel processing allowed them to complete computations significantly faster than their predecessors. By the late 1960s, hardware advances driven by the transistor had increased the efficiency of computers and lowered their prices to the point where many companies that previously could not afford them began buying systems to handle inventory tracking, invoicing, and other basic business needs.

The decrease in size continued, and the late 1960s saw the release of the first mass-market desktop computer by Hewlett-Packard, though it was not what we think of as a desktop today. The HP 9100A was more or less a scientific calculator. Still, its smaller form factor and the convenience of desktop systems like it would make way for the true future of computing: the personal computer.

The first personal computers began to appear in the mid-1970s, though like Hewlett-Packard’s desktop, these were closer to modern calculators. It wasn’t until the late 1970s and early 1980s, with the appearance of the Apple II and the IBM PC, that personal computers as we now know them arrived. They were followed shortly after by PC clones from companies such as Compaq, Hewlett-Packard, and Dell, all built to be compatible with the IBM system. This was the birth of modern computing.

The Rise of the Microprocessor

These small-form-factor systems, small in comparison to their mainframe predecessors, were only possible because of Intel’s invention of the microprocessor in the early 1970s. A microprocessor is an integrated circuit that contains all the functions of a computer’s central processing unit. Thanks to transistors, the components of the central processing unit could shrink down to a single chip, or in some systems, two or three chips. This not only allowed computers themselves to become smaller but also sped up the processing of data; work that once required several circuit boards could now be done by a single chip. It also made computer functionality expandable to other devices, though it would be another few decades before this became common.

For most of the 1980s, the core design of the personal computer remained the same. Systems became faster, storage and working memory grew, and graphical capabilities increased, but the fundamental architecture did not change. Microprocessors allowed for more and more miniaturization, letting systems become more complex while keeping essentially the same form factor. This would begin to change with the birth of the World Wide Web and the networking of the world.

The World Wide Web

While the Internet itself had existed since 1969, it was confined to universities, government agencies, and the very specific operating systems used by various mainframes and microcomputers. However, businesses wanting to take greater advantage of their computer systems to send data between locations and business partners began hooking into the network in the 1980s. The proliferation of the IBM PC and its MS-DOS operating system made these connections easier.

Then, in 1989, a researcher at CERN named Tim Berners-Lee developed the HyperText Transfer Protocol (HTTP) and the HyperText Markup Language (HTML), along with the first browser designed to use them. This development would open the network to anyone with a computer and a modem, paving the way for the World Wide Web and the interconnected world we now live in.

From this foundation grew instant messaging, Voice over Internet Protocol (VoIP), social networking, e-commerce, and more. This technology also led to the creation of another area of digital computing: the mobile device.

Computers Go Mobile

While there had been mobile computers as early as the late 1970s, these were 50- to 60-pound boxes with tiny five-inch screens and limited processing power. By the late 1980s, lighter laptop computers had become more prevalent, but it wasn’t until the advent of the World Wide Web that such computers came into popular use. That popularity grew throughout the 1990s, and the mobile landscape changed again with the release of tablets and smartphones in the 2000s. While early releases from companies like Microsoft did not fare well, Apple released the iPhone in 2007 and followed it with the iPad in 2010. Other companies followed with similar products, creating a completely new way for people to interact with the Internet and the power of computers.

What Does the Future Hold?

As technology continues to advance, we will find computers in more and more products. Smart devices like the Google Nest already allow us to control our homes from our phones, and everything from refrigerators to coffee makers now connects to the Internet. Self-driving cars are pushing the limits of computing even further.

Reflect: What is Your Device of Choice?

Poll

What kind of computer device do you use the most on a daily basis?

Expand: The Limits of Computer Advancement

Discover

Since the 1950s, computers have gotten smaller and smaller. Starting as massive connected machines filling entire buildings, they shrank to fill a single floor, then a room, and then a desk. Today, computers hundreds of times more powerful than the massive mainframes of the past can fit in a pocket and access information from all over the world with the swipe of a finger.

The rate of this increase in speed and miniaturization was predicted around the same time as the advent of the semiconductor and the microprocessor. In 1965, Gordon Moore, who would go on to co-found Intel, wrote a paper presenting what has come to be known as Moore’s law.

Moore’s Law

Moore’s law is the observation that the number of transistors in a dense integrated circuit doubles about every two years. There are some misconceptions about it. Some think it sets a theoretical limit on the number of components per integrated circuit that can exist on a semiconductor; this is not true. Rather, Moore’s law was a forecast: the number of components per integrated circuit would double each year, which would in turn produce an equal increase in processor performance. Moore revised the forecast in 1975, stating that the doubling would happen every two years because of a slight slowdown in development speed. His prediction held true until roughly 2012.
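To make the arithmetic concrete, here is a minimal sketch in Python (not part of the original lesson; the 2,300-transistor figure for Intel’s 4004 is the only historical value used, and the function name is purely illustrative):

    # A small sketch of the doubling that Moore's law describes:
    # transistor counts roughly double every two years.
    def transistors(start_count, start_year, year, doubling_period=2):
        """Estimate a transistor count in a given year under a fixed doubling period."""
        return start_count * 2 ** ((year - start_year) / doubling_period)

    # Example: Intel's 4004 (1971) held roughly 2,300 transistors.
    # Ten years of two-year doublings multiplies that count by 2**5 = 32.
    print(round(transistors(2300, 1971, 1981)))  # about 73,600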

It is important to keep in mind, however, that Moore’s law is not binding in any way. It was a prediction based on Moore’s observations of the semiconductor industry and the rate at which chips were being manufactured, revised a decade later on the strength of ten more years of observation. It has become something of a self-fulfilling prophecy, as the industry uses it as a benchmark for the development and release of new chips.

To this point, in 2015 Intel reported that the doubling had slowed to roughly two and a half years and, at the time, did not expect it to happen again until 2018, making it a three-year period. Moore himself stated around the same time that he expected Moore’s law to be dead within the next decade, because new materials technology was likely to change the nature of semiconductors in a way that would make the law obsolete.

Rock’s Law

Moore’s law is sometimes paired with Rock’s law, also known as Moore’s second law. Like Moore’s law, this was a prediction about the manufacturing of semiconductors: Rock’s law states that the cost of building a semiconductor fabrication plant doubles about every four years.

This is due to the ever-increasing cost of researching and developing the new technologies needed to shrink semiconductor components while packing in more transistors for greater performance. Factories must continually upgrade their equipment to keep up with the changing technology, even as the cost of the chips themselves to users continues to edge downward.

This held true up until the late 1990s, when the costs plateaued. However, it is this high cost, more than any technological limitation, that is most likely to limit the size and performance of semiconductors in the future.
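The same doubling arithmetic can sketch Rock’s law, only with a four-year period; the starting cost below is a made-up placeholder meant only to show the rate of growth:

    # Rock's law applies the same doubling arithmetic with a four-year period.
    # The starting cost is a hypothetical placeholder, not a real figure.
    def fab_cost(start_cost, years_elapsed, doubling_period=4):
        """Project a fabrication-plant cost after a given number of years."""
        return start_cost * 2 ** (years_elapsed / doubling_period)

    # Twenty years of four-year doublings multiplies the cost by 2**5 = 32.
    print(fab_cost(1.0, 20))  # 32.0 times the starting cost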

The Future

While the predictions of Moore’s law may no longer hold, technology continues to advance. The development of new materials and new methods of miniaturization continues to increase the number of components per integrated circuit and to improve performance. While there may be a theoretical limit at some point in the future, we are not there yet.

Check Your Knowledge

Use the quiz below to check your understanding of this lesson’s content. You can take this quiz as many times as you like. Once you are finished taking the quiz, click on the “View questions” button to review the correct answers.

Lesson Resources

Lesson Toolbox

Additional Resources and Readings

The Birth of the Web

A website providing information on the history of the WWW with a live reproduction of the first website

History of Computers: A Brief Timeline

A website showing a timeline of computer history starting with analog computers in the 1800s up until modern day

Timeline of Computer History

A timeline from the Computer History Museum showing the evolution of computer technology from the 1930s until today

Lesson Glossary

Terms

  • microprocessor
    an integrated circuit that contains all the functions of a central processing unit of a computer
  • Moore's law
    the observation that the number of transistors in a dense integrated circuit doubles about every two years
  • Rock’s law
    the observation that the cost of building a semiconductor fabrication plant doubles about every four years

License and Citations

Content License

Lesson Content:

Authored and curated by David Thomas for The TEL Library. CC BY-NC-SA 4.0

Media Sources

Link | Author | Publisher | License
Osborne1 | Ntouran | Wikimedia Commons | Public Domain
Tim Berners Lee’s NeXT Cube | Victor R. Ruiz | Flickr | CC BY 2.0
Tim Berners-Lee CP | Silvio Tanaka | Wikimedia Commons | CC BY 2.0
ANS700-CPU-board | Henrik Wannheden | Wikimedia Commons | Public Domain
HP 9100A Calculator 1968 front | Swtpc6800 | Wikimedia Commons | Public Domain
Eniac | Unknown | Wikimedia Commons | Public Domain
Smartphone Cellphone Apple IPhone | stevepb | Pixabay | CC0