NARRATOR: It has
revolutionized the way we work, play, travel, and communicate. It touches almost every
part of our lives. It has helped win wars,
solve insoluble problems, and send us into space. Its invention is the story
of squandered chances, fortunate accidents,
frequent missteps, and unprecedented genius. Now, the creation of the
computer on "Modern Marvels". [music playing] Our world is increasingly filled
with countless wonders that would not have been
possible without one machine, the computer. Although computers are
enormously intricate, their most basic components
consist of simple devices that can be switched to either
one of two states, on or off. The computer creates its magic
by calculating with a speed and accuracy that far
surpasses its human inventors. Computers stagger the mind
with their complexity, but simply put, a computer
takes information, processes it, and then outputs
a result. It's all done with a unique partnership
of hardware and software. Hardware comes in boxes. They're the physical
components, such as the monitor and hard drive. Software comes on disks. Software consists
of instructions that tell the computer what to do. One way to start this
computational partnership is to type on the
keyboard, providing input. The input is picked up by the
central processing unit or CPU, the computer's brain. Using instructions
provided by software, the CPU processes the input. The magic behind the CPU
is its blinding speed. Modern processors are
measured in MIPS, millions of instructions per second. While processing,
the CPU may receive data stored in random
access memory, known as RAM, or data stored on the hard drive. Modern RAM is so quick
that every second, it can send the equivalent
of 10,000 typewritten pages of information to the CPU. And modern hard drives can store
the equivalent of 250,000 pages of typewritten material. After processing, the CPU
outputs information, often on a monitor. The whole procedure
is usually so quick that it appears instantaneous.
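As a rough illustration of that input-process-output cycle (the numbers and the function below are invented for the sketch, not taken from the program), a few lines of Python can mimic it: data comes in, software instructions process it, and a result is output.

```python
# A minimal sketch of the input -> process -> output cycle described above.
# The "software" here is just a small set of instructions telling the
# machine what to do with the input data.

def process(numbers):
    # The "CPU" step: apply the instructions to the input.
    total = 0
    for n in numbers:
        total += n          # each addition is one tiny instruction
    return total

if __name__ == "__main__":
    # Input: data that might arrive from the keyboard (hard-coded here).
    keyboard_input = [3, 1, 4, 1, 5]
    # Output: the processed result, as it might appear on a monitor.
    print("sum of input:", process(keyboard_input))
```

Today, computers are so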
commonplace we take them for granted. But not long ago, computers
only existed in the imagination of a few visionaries. The search for a machine
that could figure quickly and accurately has seized
the human imagination for thousands of years. In fact, the
computer's family tree has roots so deep in the past
it is impossible to know exactly where they begin. By the early 19th century, the
European Industrial Revolution was well underway, and the
development of production and commerce came from the
maturing fields of engineering, navigation, surveying,
finance, and science. The practical application
of these fields relied on volume after
volume of tables, tables for trigonometry, tides,
interest rates, multiplication, and gravity. Tables were critical. The actual figuring was done
by people who specialized in mathematical computation. Surprisingly, these people
had a familiar job title. They were called computers. These human computers
toiled over their tables incessantly, monotonously,
and made mistakes. Typically, tables
were full of errors. The requirement
for accurate tables introduced one of the most
eccentric and brilliant figures into the story of
computers, Charles Babbage. Oh, Babbage was an
extraordinary scientist. I mean, Babbage was
a great scientist. Babbage was 100 years
ahead of his time. You can't say that
about many people, but you could say
that about Babbage. NARRATOR: Charles
Babbage represented that extraordinary element of
British society, the scientist aristocrat. Many were known for
their eccentricities, and Babbage was no exception. As a youth, Babbage
devised footwear of hinged boards intended to
allow him to walk on water. Never one to shirk adventure,
he tried them out himself, but flipped over
and nearly drowned. Babbage demonstrated his
brilliance in mathematics while attending Trinity
College in Cambridge. In 1820, Babbage was checking
the accuracy of calculations made for the Royal
Astronomical Society and kept finding errors. He reasoned that a machine
could be constructed that would calculate the tables
and directly print the results. He called the machine
the Difference Engine. He drew up plans for a
section of the device and had it built with
his own funds in 1822. Babbage couldn't pay
for the construction of the entire device, but since
the greatest beneficiaries would be the British
government and people, he made the extraordinary step
of petitioning the government for a grant. In 1823, the Treasury provided
the project with startup funds. Government support for
the computer industry is nothing new. It's very much a big
topic in the news today, and it will continue to be. Computing is an
expensive proposition, and it usually requires
some government support if it's going to get anywhere. NARRATOR: Babbage hired
a mechanical engineer, set to work on a complete design
for the Difference Engine, and immediately ran
into difficulties. The mechanical machine
shops of the time were not advanced
enough to produce parts in the precise measurements
that Babbage's plans required. So Babbage designed
better machine tools, which would eventually
improve the entire state of British tool manufacturing. By 1829, Babbage had
spent the 1,500-pound grant from the government,
and even more than that from his own funds. But only a few bits and pieces
of machine had been completed. Babbage's project began
to attract critics. He was plagued by several
problems, one of his problems being his perfectionism,
another problem being that his work was not
understood or appreciated by the people of his time. NARRATOR: Babbage
had many enemies. Even London's organ
grinders despised him because he had tried to have
them banned as their music interfered with his thinking. But the inventor
continued to toil. And finally, in 1832,
there were enough parts to assemble a section
of the engine. It functioned perfectly,
solving equations and producing six-digit results. But it was only a small part
of the proposed machine. The skyrocketing costs
and lack of results finally made the government pull
its support from the project. Although disappointed
by the cancellation, Babbage had contributed
to the project's demise by suggesting that a new
device he had conceived, the Analytical Engine,
would be vastly superior to his old design. Babbage, in
hindsight, probably should have finished
the Difference Engine and seen how far he
could have gone with that before starting the
Analytical Engine. There's no question that
the Analytical Engine was more than he could handle. NARRATOR: Babbage was
obsessed with this new idea. With the Analytical Engine,
Babbage asked himself, why not build a
machine that could solve any mathematical problem? At the age of 43,
Babbage had the vision of a computer, a
vision he pursued for the rest of his life. The extraordinary fact is
that Babbage's overall design for the Analytical Engine
had many components analogous to those
in a modern computer. The heart of the
machine, the mill, made the calculations, like
the central processing unit of modern computers. An oblong structure,
the store held numbers to be used in the calculations,
like modern computer memory. Instructions and numbers could
be fed into the machine using punch cards. Much of what we know
about the workings of the Analytical Engine
came from the writings of Ada, Countess of Lovelace. PAUL CERUZZI: Among the people
who understood what Babbage was doing was a woman
named Ada Augusta, who was the daughter
of Lord Byron, the poet. She had studied
mathematics as a child and had quite a bit of talent. NARRATOR: Ada met Babbage at
one of his famous dinner parties that were often attended by the
luminaries of British science and engineering. Babbage demonstrated the working
section of the Difference Engine for her, and she was
immediately captivated by it. She published a simple
description of Babbage's vision for the Analytical Engine. PAUL CERUZZI: Ada wrote
some descriptions of it, and she also appended
to these descriptions a hypothetical way that
this machine could solve an equation. And on the basis of
those descriptions, people often call her the
world's first programmer. NARRATOR: But she was never
to program a real machine. As Babbage entered the
last years of his life, his great work was unfinished. He had become cranky and
suffered constant attack by his many enemies. In 1871, when London's
organ grinders discovered that Babbage was ill,
they surrounded his house and serenaded him, increasing
his agony until he died. Only a small portion of
the Analytical Engine was built in Babbage's lifetime. Babbage's vision of the
computer fell into obscurity, and except for the
detailed texts left by Ada, could well have been forgotten. Babbage's machines,
which were never finished, which existed, for the
most part, only on paper, were protocomputers. They were mechanical. They used gears. They used metal shafts. They weren't computers in our
sense of the term-- that is, they weren't electronic
digital computers-- but they were, abstractly and
on paper, mechanical computers. NARRATOR: It would
be nearly 100 years before a programmable
computation device would again be conceived. Babbage predicted it would take
just three years to complete the Difference Engine. It actually took him 14. "Computers" will continue in
a moment on "Modern Marvels". In the second half
of the 19th century, America's population
increased 35% each decade. America's exploding
population began to endanger one of its
great institutions, the American census. PAUL CERUZZI: The census, which
is required by the Constitution to be held every
10 years, was still being done by
old-fashioned people making check marks
on pieces of paper, and it simply couldn't keep
up with the tremendous surge of population in the US. NARRATOR: The crisis
reached a head in 1887. The Census Bureau was still
hand tallying the data from the 1880 census.
the bureau pleaded for any method that could speed
up the counting of the 1890 census. The Superintendent
of the Census had proposals for three systems, so
he decided to stage a contest. Two of the systems
relied on hand counting. The third, developed by a young,
rather humorless former MIT instructor named Herman
Hollerith, used punched cards. Punch cards would one day
become the standard method of feeding high volumes
of data into computers. PAUL CERUZZI: Now where he
got this idea we are not sure. He may have been
inspired by the fact that a conductor on a railroad
punches your ticket when you hand it to him. Hollerith's system
beat the others easily. In the tabulation
portion of the test, it was nearly 10 times faster. The Census Bureau leased
56 of Hollerith's machines at $1,000 a year each and put
them to work in July 1890. Census Bureau clerks
used Hollerith's machines to punch the cards and
then tabulate the results. Scores of operators were trained
to use the puncher quickly and accurately. The tabulating was
done with electricity. A metal pin that passed
through a card hole made electrical contact
with a cup of mercury, completing a circuit that was
registered on a tallying device that consisted of rows
of clock-like dials.
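As a loose sketch of that tallying idea (the card fields below are invented examples, not Hollerith's actual card layout), each punched hole simply advances a counter, much as a pin closing a circuit advanced a dial:

```python
# A rough, hypothetical illustration of punched-card tallying: each card is
# a set of punched positions, and a counter advances whenever a "pin"
# passes through a hole and closes the circuit.

from collections import Counter

# Each census card records a few punched fields (invented names).
cards = [
    {"male", "single"},
    {"female", "married"},
    {"male", "married"},
    {"female", "married"},
]

dials = Counter()           # stands in for the rows of clock-like dials
for card in cards:
    for hole in card:       # a pin through each hole registers one count
        dials[hole] += 1

print(dict(dials))          # e.g. {'male': 2, 'married': 3, ...}
```

Hollerith's machines were a step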
toward the later development of computers. They significantly sped up
the processing of information. The results of the
1890 census count were a triumph for Hollerith. In just six weeks, the
population count of 62,622,250 was tallied. Hollerith became the talk
of the scientific community. He rented an office and
set himself up in business. He called his new enterprise
the Tabulating Machine Company. Hollerith had the Census
Bureau business in his pocket, and the future looked bright. But it turned out to be
harder than it seemed. Hollerith's natural aptitude for
mechanical devices was obvious, but he also proved himself
to be a dogged businessman. He drummed up business in
one of the biggest industries of the day, the railroads. With the increase in
population and the push west, the railroads had grown
into enormous organizations with personnel, stations,
cars, and customers scattered all
across the country. Hundreds of clerks
produced tons of paper to help track and manage
these vast empires. Hollerith convinced the New
York Central Railroad to try out some of his machines. The experiment wasn't a success. Hollerith's machines could
compute fast enough for census work, but couldn't keep up
with the speed and volume of the railroad business. After three months, the
machines were removed. Hollerith was short on
capital and faced ruin. He moved his family into
his mother-in-law's house. He sold his assets,
even his horse, to raise money to redesign
his machines to improve their speed, reliability, and
ability to perform addition. Hollerith even customized
the punch cards for business computations, such as adding
columns to store dollars and cents. After a solid year
of tedious work, Hollerith returned
to New York Central and offered them free use of
his new, improved, and faster computing machines for a year. Within three
months, the railroad was convinced and contracted
to lease the machines. The Tabulating Machine
Company was back on track. Hollerith had avoided
bankruptcy and now had more work than
he could handle. PAUL CERUZZI: And
Hollerith, you could say, came along just in time. It was a combination
of his invention making this available, but
also the need out there required something like that. So it was a convergence
of the social needs or the social factors,
on the one hand, and the inventiveness sort of
pushing from the other hand. NARRATOR: But
Hollerith was weary. He was diagnosed with a bad
heart and ordered to slow down. In 1911, Hollerith
sold his shares in the Tabulating Machine
Company for over $1 million. Hollerith's former company
was merged with three others and, led by master
salesman Thomas Watson, grew into a major supplier
of business equipment. In 1924, Watson
renamed the enterprise International Business
Machines, IBM. Because of Hollerith, the name
IBM would become synonymous with computers. By the 1930s, as America limped
out of the Great Depression, companies like Burroughs and
IBM foresaw continued growth and success. Over the next decade,
progress would be slow. It would take the destructive
forces of World War II to give the computer
its next great advance. Herman Hollerith
remarked, "I will have, in future years,
the satisfaction of being the first
statistical engineer." He also had the satisfaction
of becoming the first computer millionaire. "Computers" will continue in
a moment on "Modern Marvels". World War II spurred
the development of the true computer,
and in the turbulent days before the German
blitzkrieg smashed Poland, a young Polish engineer walked
into the British Embassy in Warsaw and made an
astounding proposition. He offered to sell
the British the secret to the unbreakable German
code machine, the Enigma. The British desperately wanted
to crack the Enigma machine used by German commanders
to encrypt their most secret military radio messages. British intelligence
supplied the engineer with a fake diplomatic passport
and smuggled him out of Warsaw. While guarded by
French agents in Paris, the engineer provided
details on the code machine's ingenious operation. In the Enigma, plugs
were rearranged to conform to that day's
code book combination. The power of Enigma was
that this plug arrangement constantly varied how
letters were coded throughout the transmission. The number of letter
variations was astronomical, so high the Germans considered
their code machine to be unbreakable. But the British now knew
how the machine worked. They realized that
they could very quickly try different key combinations
on a small part of the code. Then, when that small part was
broken and the key revealed, the rest could be
decoded effortlessly.
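A toy cipher, far simpler than the real Enigma and written only to illustrate the point, shows the two ideas at work: the substitution changes with every letter, and knowing how the machine operates lets a code breaker test keys against a short piece of expected text until one fits.

```python
# A toy polyalphabetic cipher (NOT the real Enigma). It illustrates why the
# letter mapping changes with every keypress, and how a short known
# fragment (a "crib") lets you test candidate keys quickly.

import string
ALPHA = string.ascii_uppercase

def encode(text, key):
    # The shift advances one step per letter, loosely mimicking a rotor
    # that turns with every keypress.
    out = []
    for i, ch in enumerate(text):
        shift = (key + i) % 26
        out.append(ALPHA[(ALPHA.index(ch) + shift) % 26])
    return "".join(out)

def break_key(ciphertext, crib):
    # Try every possible key against the known opening fragment.
    for key in range(26):
        if encode(crib, key) == ciphertext[:len(crib)]:
            return key
    return None

message = "WEATHERREPORTCLEAR"
secret = encode(message, key=7)
print("recovered key:", break_key(secret, crib="WEATHER"))
```

North of London, at a secret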
installation in Bletchley Park, British code breakers built
a computer-like machine to do just that, Colossus. Colossus used over
2,000 vacuum tubes to process 25,000
characters per second. Colossus could only
do one job, but it could compute very quickly. Its deciphered
German transmissions were called the Ultra Secret,
the most closely guarded secret of the war. While the British were now
able to read German messages, the question was how to take
advantage of the secrets without tipping off the Germans
that their code had been compromised. The most senior
Allied Commanders were privy to Ultra material,
but had to exercise caution in reacting to it so as
not to tip their hand. Ultra information
was never revealed to anyone in a position to
be captured by the enemy. Field commanders often went
into battle lacking information on the enemy that was
known to their superiors from Ultra dispatches. This secrecy may have also
kept Colossus from prominence in the history of computers. While Colossus was
breaking German codes, across the Atlantic another
computational device was under construction. The machine which would
directly influence the design of all
future computers was being built in Philadelphia. In response to the
attack on Pearl Harbor, American industry quickly
became a powerhouse producer of the implements of war. But by 1943, there was
a critical shortage of a surprising component of
the war machine, firing tables for artillery pieces. Firing tables allowed gunners
to correctly aim their guns at different ranges, altitudes,
temperatures, and wind conditions. PAUL CERUZZI: To calculate
these tables required enormous numbers of
calculations, which, at that time, were
done by human beings. Incidentally, these people who
did them were called computers. That was their job title. And they operated
adding machines, mechanical adding
machines primarily. And they would simply step
through these calculations and produce these tables. NARRATOR: One of the centers
of firing table calculation was at the Moore
School of Electrical Engineering at the University of
Pennsylvania in Philadelphia. Midway through the
war, it became clear that the tables could not
be produced fast enough. This was a crisis. Without tables, new guns could
not be shipped to the troops overseas. To break the bottleneck, a Moore
School physicist, John Mauchly, made a fantastic proposal.
a giant electronic computer that would be able to figure
a single trajectory in 100 seconds. The Army, desperate for a
device to help them win the war, reluctantly committed to the
proposed cost of half a million dollars. Mauchly and a brilliant
graduate student in electrical engineering,
Presper Eckert, set to work constructing
ENIAC, the Electronic Numerical Integrator and Computer. Driven by the knowledge that
friends and relatives were dying in battle while they
worked in Philadelphia, the team of young engineers
toiled incessantly. But could they create
such a monster? Everything had to be
invented from square one. And then they had to build it,
and then they had to test it, and they had to
put it all together and make it work reliably. And then they had to
learn how to program it. NARRATOR: Nothing close to
ENIAC had ever been conceived. Nearly 100 feet long
and weighing 30 tons, it contained almost 70,000
resistors, 10,000 capacitors, 6,000 switches, and 18,000
delicate vacuum tubes. Vacuum tubes burn out
just like light bulbs. In a machine that contained
18,000 vacuum tubes, it was likely that at least
one would always be burned out, crippling the machine. Presper Eckert found the key
to making ENIAC function. He had the vacuum tubes
built to high tolerance, he critically tested them, and
then he ran them at low power. If you took
these measures, you might be able to get the machine
to work for 10 minutes, half an hour at a time. Since the machine
calculated so quickly, you can get a lot of
work done in a half hour. And so that's what happened. NARRATOR: After two
years of intense work, ENIAC was complete a few months
after the Japanese surrender. Although it wasn't finished
in time to help win the war, ENIAC was a marvelous machine. Huge and hot, it
could perform up to 5,000 additions, 357
multiplications, and 38 divisions every second. Far and away the most
complex machine of its time, ENIAC still lacked many of the
qualities of a modern computer. Its memory was very primitive. It had to be laboriously rewired
each time it was programmed and couldn't make
logical decisions based on its calculations. But with tremendous
expenditures of time and money, ENIAC had proved that
computers could be constructed. However, except for arcane
scientific calculation, did anyone really want them? The question lingered, could
anyone build a really practical computer? During development of the
Mark II computer in 1945, a relay inside the
computer failed and researchers found
a dead moth inside. This is the origin of
the computer terms bug and debugging. "Computers" will return in a
moment on "Modern Marvels". Just before the end of
the Second World War, an advisor to the ENIAC
project, John von Neumann, wrote a paper that was to
greatly affect the next stage of computer design. Von Neumann possessed
a photographic memory, an incomparably
fast mind, and was one of the principal scientists
involved in the Manhattan Project, the building
of the atomic bomb. He was also an advisor on ENIAC. The paper von Neumann wrote
delineated the structure of
a modern computer. The paper drew heavily on
the work building the ENIAC, yet was undeniably augmented
by von Neumann's brilliance. Von Neumann's computer was
to have a processing unit, a controlling unit,
memory, input and output. But most importantly in
the evolution of computers, it would hold its programming
internally in its memory. Internally held programs
give computers their power and versatility because an
internal program can modify what it does based on data or
the results of computations. In machines like the
ENIAC, programming had been hardwired or
fixed, so the machine was much less adaptable.
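A minimal sketch can make the difference concrete (the toy instruction set below is invented for illustration, not von Neumann's design): because the program sits in ordinary memory, a conditional jump can steer the machine based on values it has just computed, something a hardwired sequence cannot do.

```python
# A purely illustrative stored-program sketch: the program lives in memory
# as data, and a conditional jump changes course based on computed results.

memory = {"x": 9, "y": 0}

program = [                       # instructions held internally, as data
    ("sub", "x", 3),              # x = x - 3
    ("jump_if_positive", "x", 0), # loop back to step 0 while x > 0
    ("set", "y", 1),              # reached only after the loop finishes
]

pc = 0                            # program counter
while pc < len(program):
    op, target, arg = program[pc]
    if op == "sub":
        memory[target] -= arg
        pc += 1
    elif op == "set":
        memory[target] = arg
        pc += 1
    elif op == "jump_if_positive":
        pc = arg if memory[target] > 0 else pc + 1

print(memory)                     # {'x': 0, 'y': 1}
```

The idea for storing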
the program internally was the last key to
developing the true computer. But whether it was
von Neumann's idea has long been hotly debated. Eckert and Mauchly claimed they
formulated internal programs as a natural part of
their work building ENIAC, although they couldn't
stop and incorporate the idea into the machine. But many who read
the paper assumed that all the genius behind it
was from the great von Neumann. He was one of the most
widely regarded mathematicians in the world. Eckert and Mauchly were
relatively obscure. Eckert was a young man
just out of school. Mauchly had been a professor
at a fairly out-of-the-way college. They didn't have
international reputation. NARRATOR: Eckert and
Mauchly weren't included as authors of the paper. They felt they
had been betrayed. The most important effect
of von Neumann's paper was to spur computer
development. Eckert and Mauchly moved
into offices in Philadelphia, hired a staff, and set up
a company that would build a business computer. They called it the UNIVAC. They signed a
fixed cost contract to build a UNIVAC for
the Census Bureau, rolled up their sleeves,
and went to work. Unfortunately, building
UNIVAC turned out to be a monumental undertaking. As they struggled
to make UNIVAC real, they also struggled
financially and needed to be bailed out by a
series of larger companies. They eventually joined
forces with Remington Rand, a flourishing
typewriter manufacturer. In March of 1951, after
six years of toil, they finally delivered the first
UNIVAC to the Census Bureau. Unlike ENIAC, the UNIVAC was an
entire computer system designed for business. UNIVAC could be programmed for
a variety of data processing tasks. Compact tape drives held data,
and results were automatically printed. But even with the backing
of Remington Rand, sales remained slow. Very few people understood how
useful a computer could be. That perception changed
dramatically one night in 1952. [music and cheering] In a brilliant public
relations move, Remington Rand arranged with
CBS to use a UNIVAC on election night to predict the outcome
of the presidential race between Dwight Eisenhower
and Adlai Stevenson. Polls said the race
was too close to call. No one had ever
programmed a computer to make electoral predictions. Eckert and Mauchly's engineers
entered their customized algorithms right up until airtime. The operators fed in the results
of selected eastern precincts, and at 9:00 PM
ran their program. UNIVAC predicted a
landslide for Eisenhower, but the polls said differently. The operators didn't believe
what UNIVAC was telling them and assumed their programming
was at fault. Quickly they reprogrammed the
machine to better reflect what the experts predicted. As the night went on, it became
clear that Eisenhower would, indeed, win by a landslide. CBS sheepishly announced they
hadn't believed the machines. When all the votes were tallied,
UNIVAC's initial prediction was off by less than
1% of the final result, an extraordinary prediction
even by today's standards. The power and utility of the
computer had been proved. After the success of
UNIVAC, various companies began to see a future
in computer development. New companies like
Burroughs, as well as old giants like General
Electric, jumped into the computer business. But most large
American businesses were dependent on the
data processing systems provided by one company, the
office machine monolith, IBM. IBM had no computers. IBM's aging leader, the
legendary sales guru, Thomas Watson Senior, was not eager to
jump into the enormously costly development of computers. A computer back then contained
thousands of vacuum tubes, occupied one or
more large rooms, and required a small army
of attendants to run. IBM and most other people didn't
see how computers could be used in business. NARRATOR: It took Watson's
son, Tom Watson, Jr., to lead IBM into
the computer age. As Watson himself
recalls, the move was spurred by IBM's
customers, who were fed up with bulky punch cards. I remember particularly Jim
Madden, then Vice President of Metropolitan Life, who said
we're going to cancel our IBM just at the minute we
learn to do this in tapes because three floors of the
Metropolitan Life Building are used to store the cards
of our customers' accounts. And if we keep going
the way we will, they'll occupy the
whole building. We were threatened
into this progress. NARRATOR: Faced with the
prospect of losing customers, Tom Watson, Jr. Ordered the
development of a computer. IBM's famous sales force
told their customers a computer would arrive soon. In fact, it was
still being planned. Finally, in 1953,
IBM unveiled the 701. Although technologically
inferior to the UNIVAC, the 701 and other
early IBM computers were hits with the customers
because they conformed to standard IBM
systems and support. IBM saw the future,
and it was computers. They redirected
corporate efforts to computer development. IBM's efforts paid off, and
by the beginning of the 1960s, the company's large
mainframe computers dominated the business. But the development
of the computer was about to go in an
altogether different direction. [rocket blastoff] We choose to go to
the moon in this decade and do the other things,
not because they are easy, but because they are hard. NARRATOR: In 1961,
America was trailing Russia in the race for space. ANNOUNCER: 1961, a
year of achievement for Soviet scientists
in the race for space. Yuri Gagarin has become the
first human to orbit the Earth, and crowds in
Moscow's Red Square salute the
27-year-old cosmonaut. NARRATOR: As NASA engineers
began planning the lunar mission, they
realized a computer as powerful as one currently the
size of a room must be onboard. The engineers wondered, is such
a small computer even possible? The first great breakthrough
that would lead to computer miniaturization had already
been made on December 23, 1947, when three scientists at Bell
Labs, William Shockley, Walter Brattain, and John Bardeen
invented the transistor. Formed on the
semiconductor silicon, the transistor could
replace large vacuum tubes in computers. Compared to vacuum
tubes, transistors were tiny, required
little power, and produced little heat. The breakthrough was
sufficiently important that the three inventors
of the transistor were awarded the Nobel Prize. A computer that could
navigate to the moon and back would require thousands
of transistors, and although small, they
were not nearly small enough. The next step in miniaturization
occurred in 1959, when Robert Noyce and
Jack Kilby, engineers for rival transistor
manufacturers, independently came up with
breakthroughs that led to the same revolutionary idea. An entire network of
electronic components, transistors, diodes,
capacitors, and resistors, could be incorporated onto
a single chip of silicon. The great innovation
in electronics was called the
integrated circuit. Using integrated circuits,
a 10-ounce computer was built that was as powerful
as a 30-pound one made of transistors. But integrated circuits
had an inherent problem. They were difficult
to manufacture and therefore were expensive. But the space race had started
just in time to pay for them. In 1969, 5,000
integrated circuits made up the heart of each
of two identical computers, one on the Lunar Orbiter,
and one on the Lander. For their size, these were
the most powerful computers on Earth, soon to leave
the Earth entirely. MICHAEL COLLINS: Beautiful. Just beautiful. NARRATOR: As Neil Armstrong
took one small step in the lunar dust, Intel engineer Ted Hoff
was making the last great leap in miniaturization,
developing an idea that would put an entire computer
on a chip of silicon, the microprocessor. The genie would be
out of its bottle, beginning the
computer revolution and changing the world forever. [music playing] IBM took the lead in
computer sales in 1956 from Remington Rand by
selling just 76 computers. "Modern Marvels" will return. JAMES LOVELL: OK, Houston. We've had a problem here. NARRATOR: Apollo 13, intended
as the third lunar landing, had just lost two fuel cells and
was venting oxygen into space 200,000 miles from Earth. Soon the second oxygen tank
would begin losing pressure. The astronauts would die
unless they could precisely align their spacecraft
and fire their rocket to slingshot themselves around
the moon and back to Earth. Critical to the alignment was
Apollo's state-of-the-art guidance computer. A new trajectory was figured. The all-or-nothing
rocket firing was made. The Apollo 13 crew
miraculously returned to Earth with the help of their
small and powerful computer. By the last Apollo mission
two and a half years later, computers as powerful
as those on Apollo would be available to everyone. That was because shortly before the
Apollo 13 flight, an engineer at Intel, Ted Hoff, had come
up with an ingenious idea. Hoff had been told to design
12 separate integrated circuits to make a Japanese
pocket calculator. He suggested placing the entire
processing unit on a chip and programming it
just like a computer. Intel developed the idea, and by
1970, they had a working model of a microprocessor. It was the invention not
just of integrated circuits, but of a particular kind
of integrated circuit, the microprocessor,
that makes today's personal computers possible. NARRATOR: Smaller than a
fingernail, a microprocessor contains many of the
components of the computer, including a control
unit, a clock, and areas where data can be
stored and modified. Processing power was about
to become very cheap and very compact. In the mid-1970s, two friends,
Steven Wozniak and Steve Jobs, were manufacturing a small
computer in a Palo Alto garage. STAN AUGARTEN: Steve Jobs
was a college dropout, but he was a college
dropout with a difference. He was very intelligent, he
had a lot of street smarts, and he was also
extremely ambitious. I think perhaps more
important than that, he knew the other
Steve, Steve Wozniak, had created something
exceptional. NARRATOR: Steve Jobs
sold his Volkswagen, and Steven Wozniak
sold his HP calculator to finance their company
that would revolutionize the computer industry. Jobs trekked all over the
San Francisco Bay Area to find buyers for the $500
machine, which they called the Apple I. The Apple I
was large and unwieldy. Jobs realized they needed
a new design for a computer that anyone could use. Wozniak began to
build the Apple II. The fate of Apple changed
dramatically when, in the fall of 1976, a
visitor to Wozniak's garage, saw the prototype
of the Apple II. The visitor was Mike Markkula,
who, at 32, had retired from Intel a millionaire. He was so impressed
by the Apple II that he joined Apple and put
it on a sound business footing. The Apple II was
introduced to the public, and sales skyrocketed. But in 1978, even with the
success of the Apple II, using a computer wasn't easy. The early Apple
computers, as well as all early personal
computers, did not have graphical interfaces. They didn't have mice. They had what's known as
a command line interface. That is, you typed
instructions into the computer, and your instructions appeared
as text on the screen. NARRATOR: A
simple-to-use computer had been conceived by a computer
scientist named Doug Engelbart. He demonstrated his vision in
1968 at the Fall Joint Computer Conference in San Francisco. DOUG ENGELBART: If I hit
W, it'll say delete word. The arrow moves back and
forth to give me feedback. NARRATOR: Wielding a keyboard
and a pointing device he called a mouse, Engelbart
worked with a computer 30 miles away
linked by microwaves and demonstrated word
processing and hypertext. Many in the audience
went home inspired, but one group alone was to
fulfill Engelbart's vision. That group was
just down the road from San Francisco
at Xerox PARC. In 1970, Xerox dominated
the copier industry, but thought the future
might be in computers. A young Xerox executive, Robert
Taylor, worked with a team to transform the way the
computer industry was perceived. As Robert Taylor
himself remembers. The chairman of Xerox at
the time, Peter McColough, made a speech where
he said that Xerox was going to become the
architecture of information. So I asked his
speechwriter not too long after that, what did that
mean, because I had an idea about what it ought to mean. And so did some other people. And the speechwriter said,
well, he didn't really know, but it's a ringing phrase. And so I said, well, we're
going to make it happen. NARRATOR: Robert Taylor
hired many of the country's top computer scientists
and challenged them to create an easy-to-use
personal computer. The result was the Alto,
which incorporated many of the innovations in personal
computers we take for granted today, all developed
at Xerox PARC. The Alto used a mouse,
a graphical interface, built-in networking, and
printed on a laser printer. Xerox developed the Star, the
commercial model of the Alto, but it never sold well. STAN AUGARTEN: Xerox was a
large and very successful photocopier company,
and it didn't really understand computers,
didn't appreciate the brilliance, the originality,
and the enormous commercial worth of the computer
developments at Xerox PARC. NARRATOR: But Steve Jobs did
when he visited Xerox PARC in 1979 and saw the Alto. He returned to Apple
and immediately set to work on what would
become the Macintosh computer, the first
popular personal computer similar to those used today. What made the
Macintosh easy to use was its operating
system and applications, otherwise known as software. Increasingly, software was
dominating the advances made in computers. Bill Gates, the young president
of a computing software company, Microsoft, understood
the importance of software to the future of computers
and parlayed this vision into a vast software empire,
making him the richest man in the world. The dream of a machine
that could think had come from a
mechanical device to an electronic one, where the
bits of coded information that ran it were as important as,
perhaps more important than, the machine
that they controlled. This may have been the best
evidence that a thinking machine had arrived and could
now be placed on your desk. Soon nearly half the jobs in
America would use the computer. Within the decade,
microprocessors would be everywhere, incorporated
into automobiles, appliances, and scientific instruments,
significantly increasing their capability
and reliability. Unprecedented fortunes would
be made in businesses that hadn't existed a decade before. Education would be
transformed forever as the access to information
would be transferred from libraries and
universities to your desktop. The Earth would shrink with
unprecedented speed as a worldwide
communication grid became accessible to anyone with
a computer and modem. And even now, the evolution
of a thinking machine isn't finished. Computers have already
changed the way we live, and they'll change the
way we explore our world and other worlds in
the 21st century. They will take us
to distant galaxies, and they will connect
us right here on Earth. Computers will continue to
become more interconnected. They will continue to become
smaller, faster, cheaper, and software will
become more powerful. Minds that have yet to be
formed will mold this power to create new marvels
inconceivable to us today.