How the Modern Computer Came to Be


Modern computers are the result of the efforts of many physicists and engineers over more than two centuries. The history of the laptop I am writing this post on begins in 1782, when the term semiconducting was first used by Alessandro Volta, then a professor of experimental physics at the University of Pavia in Lombardy, Italy.

New Electrical Components

Semiconductors

The 19th century was the great age of experimental physicists. These guys discovered many properties of materials simply by creatively experimenting with them.

One such physicist was Karl Ferdinand Braun who was greatly intrigued by the rectification of metal-semiconductor junctions.

In 1898, Braun invented the cat’s-whisker detector, made of galena. This detector served as a rectifier, converting alternating current (AC) to direct current (DC), and was used in crystal radios up until World War II.

crystal-radio

Source: www.nzeldes.com/Miscellany/Curiosities.htm
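
To make the idea of rectification concrete, here is a minimal Python sketch of an ideal half-wave rectifier, the job a cat’s-whisker detector performed in a crystal radio. The function and the signal here are purely illustrative, not taken from any historical source:

```python
import math

def half_wave_rectify(samples):
    """Model an ideal diode: it conducts only while the input voltage is positive."""
    return [max(0.0, v) for v in samples]

# One full period of a 50 Hz AC signal, sampled every millisecond.
ac = [math.sin(2 * math.pi * 50 * (t / 1000)) for t in range(20)]

dc_pulses = half_wave_rectify(ac)
print(dc_pulses)  # negative half-cycles are clipped to zero, leaving pulsating DC
```

A real detector is far from ideal (it has a forward voltage drop and leakage), but the one-way behavior above is the essence of turning AC into DC.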

Research on semiconductors was furthered by Jagadish Chandra Bose, who patented a point-contact rectifier in 1901.

In 1926, Julius Edgar Lilienfeld patented the concept of a three-electrode amplifying device based on the semiconducting properties of copper sulfide.

In 1939, Russell Ohl discovered the p-n junction, the boundary between two types of semiconductor material inside a semiconducting crystal.

Ohl also found that purifying germanium and silicon yielded much more consistent semiconductor material for use in diodes.

diode

Source: en.wikipedia.org/wiki/Diode

Vacuum Tubes

Before modern transistors were invented, vacuum tubes were the most popular devices to control electric current.

Funnily enough, the first cathode ray tube (CRT) was also created by Karl Ferdinand Braun, in 1897. It served as the basis for all TVs and computer monitors before LCDs took over the market.

crt

Source: 3dvision-blog.com/7267-3d-vision-no-longer-supports-generic-crt-monitors-in-stereo-3d/

Braun eventually received his well-deserved Nobel Prize in physics in 1909, which he shared with Guglielmo Marconi for their contributions to the development of wireless telegraphy.

During World War II, large vacuum tube computers such as the British Colossus were used by the Allies to decipher German high-command messages encrypted with the Lorenz cipher; the more famous Enigma traffic was broken with electromechanical bombes.

colossus

Source: wallpoper.com/wallpaper/colossus-world-382336

Thanks to this wartime success, vacuum tube computers remained popular in the government and corporate sectors during the 1950s.

The Transistor

Lilienfeld’s concept of a unipolar field-effect transistor (FET) was put to practical use only after his patent expired.

In 1947, the Bell Labs team of John Bardeen and Walter H. Brattain, managed by William B. Shockley, invented the point-contact transistor.

bardeen-brattain-shockley

Source: www.commonsensedirections.org/html/Adjunct%20Pages/

The three of them shared the Nobel Prize in physics in 1956. Due to Shockley’s difficult personality, neither Bardeen nor Brattain wanted to keep working with him on the further development of transistors.

It is interesting to note that Bell Labs traces its origins to Alexander Graham Bell, inventor of the telephone, whose early research facility was called the Volta Laboratory. Bell funded it with the winnings of his Volta Prize, an award named after the same Alessandro Volta who first used the word semiconducting.

Point-contact transistors were very fragile and unstable. In the following years, Shockley invented the grown-junction transistor, which eventually evolved into the bipolar junction transistor.


The 1950s saw the rise of transistorized consumer products that were portable and low on power consumption. Most notable among them were hearing aids (1952) and transistor radios (1954).

Again, due to government and corporate demand, transistorized computers lived on until the end of the 1960s.

The Integrated Circuit

In 1954, a team of researchers at Texas Instruments’ Central Research Laboratories (TI CRL), led by Gordon K. Teal, created the first silicon transistor.

Silicon is widely available in the form of silicate minerals. At 27.7%, it is the second most abundant element in the Earth’s crust after oxygen. Thus, it is much cheaper to produce technology-grade silicon than germanium.

A few years later, in 1958, another TI CRL employee, Jack Kilby, created the first integrated circuit.

integrated-circuit

Source: www.freeimageslive.co.uk/free_stock_image/electroniccircuitsjpg

Silicon Valley

In 1956, Shockley, by then a Nobel laureate, decided to move to Mountain View, California, and start Shockley Semiconductor Laboratory. Unfortunately, his personality again proved to be his greatest enemy when eight of the most talented men he had recruited left to form Fairchild Semiconductor.

Among the many super-successful spin-off companies of Fairchild Semiconductor were Advanced Micro Devices (AMD) and Intel.

Around the same time that Shockley moved to California, Frederick Terman, then Stanford’s dean of engineering, was encouraging his students to start their own tech companies.

stanford-aerial-view

Source: www.bing.com/maps/

He had also established the Stanford Research Park in 1951 and succeeded in bringing high-tech companies such as Hewlett-Packard, Eastman Kodak, General Electric, and Lockheed to the park.

These two men, Shockley and Terman, are now recognized as the fathers of Silicon Valley.

silicon-valley

Source: mapcollection.wordpress.com/blog/tag/silicon-valley/

Electronics for the Masses

What started with smaller hearing aids and pocket-size transistor radios bloomed into mass production of portable electronics and helped usher in the new rock-and-roll culture of the 1960s.

In 1965, Intel’s future co-founder Gordon Moore published his observation that the number of transistors on an integrated circuit would double every year. Ten years later he revised this to every two years.
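
As a rough illustration of the revised observation, here is a small Python sketch that projects transistor counts forward from the Intel 4004’s roughly 2,300 transistors. The starting figure and the perfectly clean doubling are simplifying assumptions, not a claim about any particular chip:

```python
def project_transistor_count(initial_count, years_elapsed, doubling_period_years=2):
    """Project a transistor count under Moore's revised observation:
    the count doubles every `doubling_period_years` years."""
    return initial_count * 2 ** (years_elapsed / doubling_period_years)

# From the Intel 4004 (1971, ~2,300 transistors) to 2015:
# 2,300 * 2^(44 / 2) is roughly 9.6 billion.
print(f"{project_transistor_count(2300, 2015 - 1971):,.0f}")
```

That projection lands in the same order of magnitude as the billions of transistors on the largest chips of 2015, which is why the observation became known as Moore’s law.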

The prolific Jack Kilby created the pocket calculator (1967) and several more useful items. He received his Nobel Prize in physics a bit belatedly in 2000.

In 1968, Hewlett-Packard first used the term personal computer in an advert for its 9100A calculator.

In 1971, Intel started producing the 4004 microprocessor.

intel-4004

Source: www-ssl.intel.com/content/www/us/en/history/museum-story-of-intel-4004.html

The Intel 4004 was the first general-purpose programmable processor on the market, a chip that any engineer could buy and use in their own projects.

Creating Ever Smaller Computers

In 1955, the TX-0 was built at the MIT Lincoln Laboratory.

tx-0

Source: bitsavers.trailing-edge.com/pdf/mit/tx-0/pics/

The TX-0 was a transistorized version of the vacuum tube military computer called Whirlwind, but instead of filling an entire floor of a large building, the TX-0 fit into a reasonably sized room. Yet, it was faster than Whirlwind.

In 1958, the TX-2 was built at MIT Lincoln Laboratory.

In 1959, Digital Equipment Corporation (DEC) created a smaller (but still quite large) computer inspired by the TX-2, called the PDP-1 (Programmed Data Processor).

In 1964, DEC unveiled its PDP-6 computer with the Monitor software, which was later renamed to TOPS-10.

pdp-6

Source: www.computer-history.info/Page4.dir/pages/PDP.6.dir/

In 1967, DEC created the PDP-10 mainframe computer, later renamed to DECsystem-10, which came with the TOPS-10 operating system.

The Personal Computer

With Intel’s introduction of the 4004 microprocessor, such chips started to become affordable to the masses.

In 1974, MITS came out with the Altair 8800 microcomputer, based on Intel’s 8080 CPU.

altair-8800

Source: en.wikipedia.org/wiki/Altair_8800

Hobbyists liked the Altair 8800 because it was easily expandable. It was also the first product for which Microsoft wrote code, specifically the Altair BASIC interpreter.

In 1975, Olivetti presented the first pre-assembled personal computer — the P6060.

olivetti-p6060

Source: www.old-computers.com/museum/photos.asp?t=1&c=407&st=1

In 1976, the Apple I computer was hand-built by Steve Wozniak while his friend Steve Jobs arranged the first sale of 50 units.

apple-i

Source: en.wikipedia.org/wiki/Apple_I

In 1981, after missing the start-up phase of the personal computer era, IBM introduced its IBM PC. The IBM PC became an instant success and, together with Microsoft’s MS-DOS, ruled the world of personal computers for the next five years.

ibm-pc-xt

Source: en.wikipedia.org/wiki/IBM_Personal_Computer_XT

In 1983, IBM came out with the IBM PC/XT and a year later with the IBM PC/AT. The XT shipped with DOS 2.0 and Intel’s 8088 CPU. The AT shipped with DOS 3.0 and Intel’s 80286 CPU.

IBM’s reign of personal computers ended in 1986 when Compaq designed and manufactured the first PC based on Intel’s new 80386 microprocessor.

compaq-80386

Source: imgur.com/gallery/i7RH3

Others saw the opportunity and acted on it. Commodore International started selling its Amiga computers, a significant upgrade from the low-end Commodore 64.

Atari’s ST computers and the higher-end Apple Macintosh were not far behind.

In 1990, Microsoft released its Windows 3.0 operating system, which allowed multitasking of old MS-DOS programs. Together with Intel’s CPUs, Microsoft’s operating systems have dominated large parts of the computing market up until today.

Market Fragmentation

Further miniaturization of personal computers and the creation of innovative computing devices fragmented the once monolithic market. Other companies now rule various parts of these markets: Google (Android) the mobile operating systems market; Samsung and Apple the smartphone and tablet markets; and Lenovo, HP, and Dell the desktop, laptop, and netbook markets.

However, both Microsoft and Intel retain massive dominance in their core markets. Intel has more than 80% share in the microprocessor market and more than 90% of desktops, laptops, and netbooks use Windows operating systems.

As computers get even smaller, from laptops, netbooks, and tablets to smartphones and microcomputers such as the Raspberry Pi, the currently used CMOS transistor technology is approaching the 10 nanometer wall.

The latest Intel Skylake CPUs, released in August 2015, use a 14 nanometer transistor fabrication process.

It seems that silicon will be replaced with more complex materials below the 10 nanometer technological node.

