>> --always wanted
to do this, and when you get ready to
do something like this, after you get ready
for a while, you begin to wonder if
it was such a good idea. (audience laughing)
Because, I mean, the "50 minutes" thing has
got me really nervous. There's probably about 50 weeks'
worth of material here, which will now be distilled
in 50 minutes or less. The reason for the 50 minutes
is because I'd like to have a few minutes at
the end for questions. If you have a question that you
just can't-- you can't survive unless you ask it immediately,
you may certainly do so, and I may give an answer,
or I may say, "That's a great question--
I'll answer it at the end," or I may give the answer
that I give most of the time, now that I am department head,
which is, "I have no idea." (audience laughing) Um, when you're gonna
give a talk like this, you gotta figure out
where you're gonna start. And... okay, so, mathematics--
basic mathematics-- counting or measuring things,
measuring areas, measuring lengths. Somebody had to do that
first, but we don't know who, we don't know where,
we don't know when. And you know, there are
some extremely deep, fundamental ideas here that
we're probably never gonna know the answer to. I mean, it was genius
for someone to realize that if you had this,
and if you had... those, and
if you had... these... that somehow, they were
all related by something that we now
call "five." And to actually realize that
there was that uniting theme that needed a name and
a notation, that's deep. And we have no
idea who did that. And anthropologists
might have theories, but that's way out
of my realms. So, we don't know
who did that, so we'll start
someplace where we have some
idea of what went on. When you talk about
early civilizations-- Egypt, uh, Mesopotamia,
the Babylonians-- present-day Iraq,
Iran, Syria-- that area-- the Tigris and
Euphrates... the cradle of
civilization. Uh, China and India
come to mind. Now, these all had
certain things in common. They all dealt with
arithmetic and geometry-- in some cases,
fairly advanced. Babylonian geometry actually
got pretty intricate. They knew
the Pythagorean Theorem a long time before
Pythagoras was around. China was using
the equivalent of our decimal
fractions-- not with our notation,
of course-- um, several hundred
years BC, which is about
1,500 years before they were
used in Europe. And India-- the numerals
that we use are called "Hindu-Arabic numerals" because they began
in India probably around
the 3rd century BC. So, they worked
with arithmetic, they did some amazing
things with geometry, considering
the era, but one thing that is notable
about all of the mathematics from this era is there is not one instance,
anywhere, of anything that would be considered
a formal proof, or even a discussion
of a demonstration. "Here's why something works"--
they never did that. Never. It was, "Here's
a problem. "Do this, do this, do this,
do this, do this," and they often ended
with something like, "And you will see
that it is right." They never
explained why. It was just something you
were supposed to "see." >> That's not
mathematics, is it? I'm sorry...
(audience laughing) >> There's a heckler
in every crowd. (audience laughing)
Um... and the mathematics that they
did was often very practical-- I mean, commerce and finding
areas for land and what not, but it was often
recreational. You find problems that
were clearly posed merely to be puzzles or to maybe see if you
could stump someone else-- you know, like the word problems
we give our algebra students. Now, for those of us who
are a little more refined in our tastes and feel that
you don't have mathematics unless you have
formal proof, then we have proof by
deductive reasoning, which was started
by the Greeks... And Thales, 600 BC,
the flyer-- the person in the upper
left-hand corner of the flyer-- is some artist's rendition
of what Thales may have looked like. He is the one
who is credited with first giving
logical proofs. Now, a logical proof in
mathematics is airtight if done correctly. You start with axioms,
postulates-- you start with statements that
everybody believes to be true. Given two points, there is
exactly one straight line that goes through them--
that is an axiom. You start with
definitions... "A circle is the set
of all points in the plane "a fixed distance
from any given point." And then, you use logic to
deduce other propositions from those definitions
and axioms. That approach is--
I mean, the idea that you have
to prove
things based on previously
established results or previous definitions
and axioms-- that is still
used today. The Greeks set
the standard for that. And the geometry
that they did-- most of their work
was in geometry. They did very little
with number theory and even less
with algebra... uh, a little bit
with trig. I wanna say something
about that in a moment, but their
geometry-- all the geometry that you
learned in high school and if you had any
geometry prior to some
non-Euclidean stuff-- all that geometry, the Greeks
knew 2,500 years ago. They knew all of that
and a lot more. The trigonometry is interesting
because it was astronomers who started the study
of trigonometry-- it was essentially
an outgrowth of-- an extension
of geometry, and they were trying to
figure out how the earth moved around
the sun. Now, this is
3rd century BC. Greeks knew-- or at least
some of them knew-- that the earth moved. The earth moved
around the sun. It was not something
that was born in Europe in the 16th
or 15th century. It's actually much
older than that. In terms of who was doing these
things-- well, Pythagoras. We all know
that name. It's the Pythagorean Theorem,
named after him. He may have proved it--
no one knows for sure. It's fairly certain that
the school of philosophy that he founded-- someone
in that school proved it. That school was also the first
to come up with the idea that there is such a thing
as an irrational number-- a number that is not simply
the ratio of two integers. As far as the Pythagoreans
were concerned, that was a
huge finding.
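The classic example-- traditionally credited to
the Pythagorean school-- is $\sqrt{2}$, the diagonal
of a unit square. A sketch of the standard argument:
if $\sqrt{2} = p/q$ in lowest terms, then
$$p^2 = 2q^2,$$
so $p$ is even, say $p = 2k$; then $q^2 = 2k^2$ makes
$q$ even too, contradicting "lowest terms."
So no such fraction exists.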
Euclid is not necessarily
as common a name, but his "Elements" is the most influential
mathematics text ever. It's been in print
for 2,400 years. It basically-- if you
studied mathematics from the time of Euclid until
into the 19th century, when you started studying
serious mathematics, you studied Euclid. Lincoln
studied Euclid. Newton
studied Euclid. All the big names
studied Euclid. And Euclid based his geometry
on that axiomatic method with deductive reasoning, and it was considered to
be such an excellent work that it was viewed
as infallible. It was almost as the same
level-- in the Christian era, it was almost at the
same level as the Bible. You know, people
who know the Bible and will quote it
by chapter and verse? People who
knew Euclid-- which was anyone
versed in mathematics in medieval Europe
and later-- they would quote
Euclid, as well. "In Book I, Proposition 47,
we find--" and by the way, that's
the Pythagorean Theorem. (audience laughing)
It was revered. Well, somebody has
to know this stuff. Archimedes--
we'll probably say a little
bit more about him later, but for right now, you
have to say that Archimedes was 2,000 years
ahead of his time, in terms of what he was
able to do with mathematics. Not only did geometry
and number theory-- he did some very amazing things
with some calculus ideas. One of the greatest
mathematicians of all time. His ideas predated comparable work in
Europe by about 2,000 years. And if you've ever
studied conic sections-- those got brought
up recently. Yeah, conic sections--
uh, ellipse, circle, parabola, hyperbola. The Greeks
studied those, and Apollonius wrote a
significant treatise on them in the 2nd century BC. Conic sections played a
very important role later with astronomy and the orbits
of planets around the sun. Meanwhile, from 400
to 1200 AD in Europe, nothing was happening. (audience giggling)
Nothing. I mean, they call it
the "Dark Ages" for a reason. If you're looking for anything
of significance in Europe between 400 and 1200 AD,
there's nothing there. You're looking in
the wrong place. Maybe go looking
in the Middle East. Mesopotamia,
the Babylonians-- the prophet Muhammad founded
the religion of Islam in the 7th century AD. By around 700,
the followers of Islam were basically heeding
the call of that religion which, at that time,
was very strong on "We want to
seek knowledge, "we want to seek truth,"
and so, there was a... a tremendous calling
among scholars to learn for the sake
of learning. And so, because of that, um--
(cell phone ringing) Middle Eastern--
I won't do what I do in class, but-- Middle Eastern
mathematicians acquired ancient geometric
texts from the Greeks, translated them
into Arabic. They took the numerals from
India-- the Hindu numerals-- and now, we call them
"Hindu-Arabic," and created what
we would now call "arithmetic with
Hindu-Arabic numerals." You know, adding, subtracting,
multiplying, dividing, like we do on a
normal basis. That was done on a regular basis
in the Middle East by 1000 AD. This word
right here... uh, "The Condensed Book of
Calculation of al-Jabr." al-Jabr...
"algebra." That's where
"algebra" came from. The word "algebra"
comes from this work by al-Khwarizmi
in 825. A tremendous amount
of algebra was being done in the Middle East
at that time. Now, you wouldn't
recognize it as such because it was very
geometric in nature, in the sense that you base
things on geometric ideas, and they didn't have
the notation we had. But if you look carefully,
a lot of what we do today, a lot of the factoring we do--
you know, squaring binomials, cubing binomials,
all that stuff that our students sometimes
don't do so well with... they were
doing that. And if you are of
a literary bent and you're familiar with
"The Rubaiyat of Omar Khayyam," that is the same
Omar Khayyam-- he was also
an excellent mathematician and did some work with third-degree
polynomial equations-- cubic equations-- that was not surpassed in
Europe until about the 16th century. An interesting thing about this
is there is a huge amount about Middle Eastern
mathematics from this time that is virtually
untouched. There are mosques and
palaces and basements filled with manuscripts that
have never been looked at. And so, the contribution
from the Middle East at present
close to known completely. Well, okay,
so, 1200 AD-- eventually, Europe
started to wake up. The Hindu-Arabic
numerals that we use started making their
way into Europe by around the
10th century, and if you've ever heard of
the Fibonacci Sequence... let's see, how
does that go again? 1, 1, 2--
what's the next one? >> Three.
>> And then? >> Five.
>> Just keep adding... >> Eight.
>> Okay. Fibonacci is known for that,
and that's wonderful, but that's really,
in a sense, trivial.
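The rule itself, as a minimal sketch in Python
(the function name is just illustrative):

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers; each new term is the sum of the previous two."""
    terms = [1, 1]
    while len(terms) < n:
        terms.append(terms[-1] + terms[-2])
    return terms[:n]

print(fibonacci(8))  # [1, 1, 2, 3, 5, 8, 13, 21]
```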
What he's really to be
honored for is the fact
vocal advocate of Hindu-Arabic numerals
being used in Europe. Uh, published the
"Liber Abaci" in 1228 AD. And it only took about 300 years
for Hindu-Arabic numerals to become widespread
in Europe. I mean, if you
think about that, you look at the numbers
we use on a regular basis, you'd think, "Why wouldn't
you use something like that?" They were using Roman
numerals prior to that. And Roman numerals
are really cumbersome, hard to read, hard to
compute with, okay? The mercantile class
basically used Roman numerals just to record things.
But it took 300 years to
actually get universal adoption of Hindu-Arabic numerals
across Europe. And just to give
you a sense of how advanced Hindu-Arabic
numerals were considered, I'm gonna read you
something here. This is from
about 1450. "A German merchant had
a son whom he desired "to give an advanced
commercial education. "He appealed to a prominent
professor of a university "for advice as to where
he should send his son. "The reply was that if
the mathematical curriculum "of the young was to be confined
to adding and subtracting, "he could go to
a German university," but if he wanted to learn
how to multiply and divide, the only country where he
could learn that was Italy. It was high-powered
stuff back in 1450. (audience giggling)
So, I don't know if some of you wanna share that
with your 095 classes or not... (audience laughing) So, Europe begins
to awaken. Fractions had been around,
essentially forever. I mean,
in recorded history, fractions have
always been present. The Egyptians were using
fractions 5,000 years ago. Um, now, fractions
gave way to fractions that were a little
more predictable. Base 60 fractions. Babylonians used
base 60 numbers-- sexagesimal
number system. Not decimal--
sexagesimal. They used base 60, and they
used base 60 fractions. And we still
do today, right? That's what
this is. This is 57/60ths of a degree,
and then this is 48 seconds, but there are 3,600 seconds
in 1 degree. And so, this is 48/3,600ths-- these
are sexagesimal fractions.
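In modern symbols, the angle on the slide is presumably
$$57'\,48'' \;=\; \frac{57}{60} + \frac{48}{60^2}\ \text{degrees},$$
so minutes and seconds are literally the first
two base-60 "decimal places."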
So, it's not like you've
never seen 'em before. But those were used pretty
extensively in Europe and the Middle East well
into the 17th century. Decimal fractions
as we know them-- um, Simon Stevin,
Belgium, advocated base 10
fractions using, um... a notation that we
might find a bit odd. We would use
this notation. Stevin's "Art of Tenths" from
1585 used this notation... Which, if you look
at it the right way, you're really talking about
negative exponents, aren't ya? 'Cause this is
one-tenth, right? 10 to the -1. This is 4/100ths,
10 to the -2. It's not a horrible notation,
but it's kinda cumbersome.
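The notation itself isn't reproduced here, but Stevin's
scheme tagged each digit with a circled index giving
the power of ten in the denominator-- roughly
(the circled indices rendered here as parenthesized subscripts):
$$1_{(1)}\;4_{(2)}\;8_{(3)} \;=\; \frac{1}{10^{1}} + \frac{4}{10^{2}} + \frac{8}{10^{3}} \;=\; 0.148,$$
which is exactly the negative-exponent reading:
$10^{-1}$, $10^{-2}$, $10^{-3}$.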
But there were two important things--
he published a book saying,
"We should all be using "these decimal fractions
and here's how they work." And at least
he had a notation. So, it was a step in
the right direction. Um, did not get much attention
for quite some time. It was one of those books
that got published and hardly anybody
read it... but then, along
came logarithms. Now, this will warm the hearts
of certain people in here who are of a
certain age-- I'm seeing some
smiles from some and others just blank looks
'cause you're too young. (audience chuckling)
Um, John Napier, Scottish, had the brilliant idea that you
could transform multiplication into addition. And the theory
is he got that idea from trigonometric
identities. There are trigonometric
identities that allow you to transform products
to the sums... and he basically figured, "Well,
you oughta be able to do that "for just plain
old numbers," and so he came up
with an idea that allowed you to transform
products into sums, quotients into differences,
powers into multiplication, and roots-- like cube root--
into division. And for those of you
old enough to remember, it wasn't a lot of fun but
it was a heck of a lot easier than doing it
by hand, okay?
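Those transformations are the familiar log identities:
$$\log(xy) = \log x + \log y, \qquad \log\!\left(\frac{x}{y}\right) = \log x - \log y,$$
$$\log\left(x^{n}\right) = n\log x, \qquad \log\left(\sqrt[n]{x}\right) = \frac{1}{n}\log x.$$
Look up two logarithms, add them, look the sum
back up in the table-- and you've multiplied.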
The problem with Napier, though,
was that he used a base that-- well, okay. This gets
complicated. He essentially used
logarithms of base 1-over-e. It wasn't exactly that,
and "e" hadn't even been
discovered yet, okay? But that's basically
what he had. They're very
difficult to use. With logarithms of that base,
as the numbers got bigger, the logarithms got smaller--
it was just weird.
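In modern terms, Napier's logarithm is usually rendered as
$$\mathrm{NapLog}(x) \;=\; 10^7 \ln\!\left(\frac{10^7}{x}\right),$$
which decreases as $x$ grows-- up to the $10^7$ scaling,
it behaves like a logarithm with base $1/e$.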
So, Napier and
Briggs met, and Briggs had the idea that
we'd use base 10 logarithms-- the common logarithms
that were in the tables that some of us used, and you have a "log" key
on your calculator-- L-O-G will find
base-10 logarithms. Briggs probably goes
down in history as one of the most
heroic figures in the era of computational
mathematics. In order to create "log" tables
that he knew would be accurate, he began by taking
the number 10 and extracting 54 consecutive
square-roots of it, by hand, to 30 decimal places.
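In modern notation, 54 successive square roots give
$$10^{1/2^{54}} \;\approx\; 1 + 1.28 \times 10^{-16},$$
a number so close to 1 that its base-10 logarithm
is exactly $2^{-54}$; roughly speaking, anchor values
like that let $\log x$ be related to $x - 1$,
from which the rest of the table can be built.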
It took him a couple
of months to do that. And then, he built
his logarithm tables meticulously to
14 decimal places... and I say "decimal places"--
if you're dealing with "log"s-- they're base 10 "log"s, you
gotta use base 10 fractions, you want to use
decimal numbers. Simon Stevin's decimal numbers
really didn't catch on that fast because there wasn't
a need for it, but as soon as you had
logarithms in play, it was like, "Oh, my gosh--
we need decimal points," and that's when decimal
numbers really took off. Now, symbolic algebra-- that's
the algebra we all know, okay? Algebra was not like
that until very recently. Algebra started out
being rhetorical. Rhetorical algebra was common
in Europe and the Middle East for a very
long time. Here is a statement in
rhetorical algebra, okay? Some of you can try this on
in 095, 6, 7, or 8, all right? "In the rule of three,
argument, fruit, "and requisition are
the names of the terms. "The first and last
terms must be similar. "Requisition
multiplied by fruit "and divided by
argument is produce." (audience giggling)
Okay? "The first and last
must be similar." Argument... requisition. Requisition multiplied by
fruit produces produce-- you're talking
about a proportion. This times this,
divided by that... will give you "P." It was called
the "Rule of Three." It was called
the "Golden Rule." Merchants use that all the time
to figure out how much-- you know, if it's this
much for how many, how much it'll be for this
many-- they used that a lot.
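In modern terms, with made-up numbers: if 3 yards
of cloth (the argument) cost 12 pence (the fruit),
then 7 yards (the requisition) cost
$$p \;=\; \frac{\text{requisition} \times \text{fruit}}{\text{argument}} \;=\; \frac{7 \times 12}{3} \;=\; 28 \text{ pence}.$$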
But that was
rhetorical algebra. It was prosaic, but it wasn't
easy to do anything with. Syncopated algebra
was a step up. Instead of using words, you used
abbreviations, okay, like this. This is from
Italy, 1494. Um, my knowledge of
Italian is vast... It's non-existent, okay? This stands for "cubo"--
this is X-cubed. "meno," "less"
means "minus." This is X-cubed minus--
"censo" is for X-squared. You got X-cubed
minus X-squared. This is for "plus." Uh, this is the most
interesting one. This is for "cosa." "Cosa" translates
into "thing." "Thing." What is the "thing"
you are solving for? That's the
variable X. So, X-cubed minus X-squared
plus X equals-- that's "(indistinct),"
I think-- zero. That's a
cubic equation. Okay? Um, what makes this
interesting is that "cosa"-- algebra became known in Europe--
see, this is what, 1494-- yeah, 1400s,
1500s, 1600s-- algebra was known
as the "cosic art." "Cosic art," because
you work with "cosa," you work with "things,"
you try to find "things." You try to solve
for the unknown. But, you know, this
is not our algebra. Our algebra--
the symbolic algebra that we're
familiar with-- really began taking off
in the 16th century, and by the middle
of the 17th century, it was pretty
much standard. Not completely, but
you would recognize it. If you picked up something
from the 1650s in algebra and read it,
you'd recognize it. Um, it's hard to say
who did what, when, because so much was going
on then, so we'll just say, "Many people in
Europe did this" between 16th and
middle of 17th century. Now that you've got algebra--
okay, so you got algebra, right? You got things like X-cubed
minus X-squared plus X equals zero. Well, then, you've
had geometry, right? Remember-- Greeks, geometry,
Euclid, revered, everyone-- anyone who studied
mathematics knew geometry. Now you've
got algebra. So, what's
the next step? Unite them. Now, I have to mention Oresme--
Nicole Oresme-- because he was so far ahead of
his time that it's almost scary. 1350-- that was the era of
the Black Death in Europe, and here is Oresme studying,
among other things, velocity-time graphs. Are you familiar with this at
all from anything in physics? >> Say it again.
>> Uh, velocity-time graphs. >> Sure.
>> But from this era. >> No.
>> He was doing
velocity-time graphs around 1350, which is about
300 years ahead of anybody else. And then,
he died young and people pretty much
forgot what he did. The two people for
whom we give credit for establishing
modern-day coordinate
system are right here-- Fermat and
Descartes. Both French. Both came up with their
coordinate systems at virtually
the same time. Fermat was a little
ahead of Descartes... there's probably a joke
in there someplace. (audience chuckling)
But I won't do it. I'm not Pruis. (audience laughing)
Um... it's an "in" joke. Fermat was first, but Fermat
was an amateur mathematician. He was a lawyer by trade,
and from what I've read, he wasn't a very
good lawyer. (audience laughing)
And you know why? He spent so much
time doing math. So, obviously, he's one
of my personal heroes. Um, he actually came
up with his idea first, but he didn't
publish anything. He just
didn't care. He didn't publish
a thing. He circulated
a few manuscripts. And actually,
Descartes got a hold of
one of those manuscripts, and he was just putting the
finishing touches on his method and published this right away
because he wanted credit. And okay, fine,
I won't get into that. I mean,
he published-- they both came out at
essentially the same time. They had very different
takes on things. Um, Fermat essentially
was interested-- "Give me an equation, let's see
what we can find out about it. "Let's study the equation
geometrically." You know, like we do with
a graphing calculator. You wanna figure out
where that-- you know that equation
up there earlier-- X-cubed minus X-squared
plus X equals zero-- what are its
solutions? Graph it and look at
the X-intercepts. That's roughly what Fermat
was interested in doing. Descartes was more
interested in... "Give me a curve--
I wanna find its equation, "then study the curve
algebraically." His intent was "I wanna see if
we can study Euclid's geometry "from an algebraic
point of view." Okay, that
was the method. A very important idea
that underlies all this is that variables
now actually varied. There is a huge difference
between this... and this. Equations like this had been
worked on in various forms. Babylonians did this
2,000 years BC. Find a number that you
can subtract 2 from in order
to get 10. This acts as a placeholder
and nothing else. That's all it is. It just takes the spot of
something you don't know.
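The two equations on the slide aren't reproduced here,
but the contrast is presumably between something like
$$x - 2 = 10,$$
where $x$ merely stands in for one unknown number, and
$$y = x - 2,$$
where $x$ and $y$ range over all values at once.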
This-- within this context,
this has a continuous graph. It's a line. You know, it looks
something like this... I hope it looks
something like that. It looks something--
it's continuous. I mean, you can go from
point to point to point in a continuous manner, which means these variables
can vary continuously, and that was
a brand new idea. That was huge because that
actually gave rise to something really significant
in just a few years. The stage is set. In 1650 in Europe,
Greek geometry was known, symbolic algebra was
essentially as we have it today, Hindu-Arabic numerals, people were very
comfortable with them, logarithms were being used,
and we had a coordinate system. So, now, we
are ready for... enter "the calculus." Now, "the calculus"-- there
is a distinction to be made between this
and this, and I think the easiest way
I can explain it is simple. I can do this. I'm just a guy. High school students around
the country do this every year. You don't have to be
a genius to do this. I'm living proof
of that. But if you're
going to do this, you had to be
extraordinarily clever, because this is a collection of
rules, notation, and procedures that creates a system by
which you can solve problems in an organized
manner. "The calculus" is
a tool that applies to a wide variety
of situations, and you don't have
to be a genius to use it. These-- the people
who did this had to be
extraordinarily clever-- I mean, Archimedes was
a genius of the first rank, and to do what he did,
he had to be. Nobody else could do what
he did in 3rd century BC. He was pretty
much alone... because he would take
some really fundamental
calculus ideas and apply them in
extraordinarily clever ways, mostly geometric for him--
but for some of these people, it was largely geometric
and algebraic-- and come up
with results. These people down here--
these are all-- they did their work from
1600 to around 1660... and impressive results. Kepler, who
came up with his "Three Laws of
Planetary Motion"-- one of which is all
the planets orbit the sun in the shape of an ellipse,
with the sun at one focus. And in order to
deal with that, he also dealt with some
problems involving area. He used calculus techniques
to work with those areas. Fermat, who we met earlier, was
working tangent line problems. You know your calc
problems where you wanna find maximums and minimums while looking for a tangent line
that's horizontal? Fermat was doing
that in 1629... but he had to treat
each problem as if it was
a brand new problem. All these people did a lot
with calculus ideas but they didn't get
to "the calculus" until someone who may
have been sitting in on one
of Barrow's classes. Isaac Barrow was at Cambridge
and, in 1664 to 1665, he gave lectures
in which he demonstrated the geometric
connection between derivatives
and intervals. And for those of you that
don't know any calculus, he demonstrated geometrically
that the tangent line problem was the inverse of
the area problem. He demonstrated that, and he moved on because
he didn't see any significance in it. That was left to
one of his pupils. Isaac Newton... I mean, what
can you say? I mean,
there are geniuses and then there are
transcendent geniuses. His work in
the calculus-- his discovery of calculus took
place over a very brief period. He published,
essentially, none of that. He published
none of that. Some of that work was not
published till the mid-1900s. It's true. And some of you will glare
at me when I say this-- he was a top-notch
mathematician. One of the
greatest ever. I would not say he
is the greatest ever. He was the greatest
physicist ever. His fame, in my eyes,
lies right here. He published
this book, which is the most influential
book in the history of science-- no question. It transformed
science. Science became mathematical with
the publication of this book. Newton demonstrated "If you
wanna understand the world, "if you wanna understand
the universe, "I have some
fundamental ideas. "I'm not gonna tell
you why they work. "I feign no
hypothesis. "I don't know why gravity works,
but here's how it works." I mean, if you use these
ideas together with calculus, you can do a lot. And that transformed
science. But he didn't
publish much... and he wasn't a lousy writer but
he wasn't the greatest writer, and he used
lousy notation. His notation
was poor. He didn't care. If he wasn't writing
for the masses, he was writing for scholars
and for himself. So, Newton developed
his calculus, Leibniz developed his
a few years later. It is Leibniz's calculus
that we use today. Every time
you write this... which is just beautiful--
that's Leibniz's. October 1675--
I forget the exact date. Copious notes. I mean, you can
look at his notes. You can find out exactly when
he came up with this notation, exactly when he came
up with this notation. The common notations
used in calculus, those are Leibniz's
notation. And he was very--
you know, "the calculus"-- Newton really wasn't
so concerned about creating
"the calculus" that other people
could use. He wanted to use it but
he didn't really care too much if other people
could use it. Leibniz wanted other people to
know how to use "the calculus," and so, he actually
created a journal-- he founded a
journal in 1684 for the sole purpose of
publishing his calculus results. It's one way
to get attention. It was before
the internet. You couldn't just post
this stuff online yet-- had to have
a journal, okay? Imagine sending
a 140-character tweet, and "Guess what?
Today, I founded calculus." (audience laughing)
Um, now, there is-- if you know anything
at all about
the history of mathematics, you've probably heard
of the priority dispute between Newton's followers
and Leibniz's followers, and I wanna emphasize
their followers. Neither Newton nor Leibniz
started the priority dispute. Neither one did. But they were both
drawn into it. It got ugly. It's just-- it's a
sad state of affairs in the history
of mathematics. It really hurt English
mathematics because the English
mathematicians-- I mean, they were ardent
followers of Newton, and Newton was
harder to understand and had lousy
notation, and so, mathematical work
in England sort of stagnated for about 100 years. Meanwhile, in Europe,
things were really heating up. Now, I had
to include this. This is something I inserted
a couple of days ago-- we have to pause
for just a moment. 17th century in Europe is known
as the "Heroic Century "in Mathematics," and the best way
to describe that is if you were to compare
mathematics in 1600 to mathematics
in 1700... a vast change
had taken place. European mathematics
in 1600, aside from the language
differences, looked pretty much like
the mathematics of Greece. There was a little more
algebra, but remember, there wasn't much
algebraic notation. So, it was basically not much
different than Greek mathematics in 200 BC. By 1700,
everything had changed. You had the algebraic symbolism,
you had logarithms, you had Hindu-Arabic numerals
being used everywhere, you had calculus,
you had graphs. The emphasis
had changed. 1600-- mathematics was
almost all geometric. 1700-- geometry was
still important, but it was definitely
in the background. Symbolic algebra,
symbolic calculus had taken the
foreground, and it was the
exponents of Leibniz who took the charge
on that. The Bernoulli
family... Jacob and Johann Bernoulli--
and there were others-- learned from Leibniz. Leibniz
taught them, the Bernoullis taught
L'Hospital and Leonhard Euler, another titan
of mathematics-- greatest mathematician
of the 18th century, no question
about that. Um, brilliant and
scarily intuitive. He just sort of said,
"Oh, this'll work." And then, he'd work out
a bunch of stuff and he was convinced it
was right, and he moved on. We need to come back
to that in a moment. But he was definitely
the chief proponent of "We've got this powerful
tool called 'calculus.' "We're just gonna
keep using it "because it's giving us
these amazing results." And the amazing results were
sometimes mathematical and sometimes they were
in the applied world-- physics, astronomy,
engineering. One of my favorite episodes
from this era is the discovery of
the planet Neptune, which was done not by taking
a whole bunch of telescopes and probing the night sky
for months and years. It was done mathematically,
using the physics of Newton, as improved by Laplace...
celestial mechanics. And the mathematics of the era,
which was calculus-- they did what's called
"inverse perturbation theory," and they essentially said,
"There's got to be a planet "out there that is
screwing up the orbit "of the planet
Uranus... "and we have
done the math, "and the planet will
be there at 9 o'clock "on this certain night"
in 1846, and they pointed their
telescopes and there it was. It's just--
it's astounding. So, the mathematics was
working marvelously well, and Euler was
their leader, okay? Now, that's great. If you're going to make
a lot of discoveries, you want to just
try stuff. I tell my students
that all the time. "Just try stuff!" Okay? But every once in a while, it's
important that you can prove that what you're
doing is correct. And after a giddy 100 years
of just flying with calculus to see what it would do,
we get to the 19th century, which was kind of a tumultuous
century in mathematics. Uh, it began by--
I don't know, "began" is-- this is not necessarily
purely chronological. Challenging truth. Truth. In this slide,
"truth" means Euclid. Remember, Euclid
was infallible. Euclid's geometry
was the geometry. If you're gonna do geometry--
it's Euclid. That's it, there's nothing
else to discuss, move on. Well... the parallel postulate
is one of those things-- remember, now it's a postulate,
it's an axiom, it's one of those things that's
supposed to be really obvious. You know, like, "Here's a point,
here's a point-- "there's
exactly one line." Euclid's parallel
postulate is very wordy, and you look at it
and you think, "Oh, this is something
we could prove." This isn't
an axiom. This is something you
state without proof. We can prove this,
and mathematicians
for over 2,000 years were convinced that they
could prove Euclid's postulate from other
postulates... that it was
actually a theorem. And they tried and they failed,
and they tried and they failed, and after about 2,000 years
of trying and failing, people began to get
the idea that, "Well, maybe
we can't do it." And some interesting
things began to happen. If you have Euclid's
parallel postulate, then one of the things you can
prove is that in any triangle-- when you add up all the
angles in any triangle, what do you get? >> 180 degrees.
>> 180. Now, that's Euclidean
geometry-- "E.G."-- okay? Gauss-- who more has to be said
of-- when he was 15 years old-- he's 15 years old and he begins looking at
the parallel postulate and saying, "There's something
about this that bugs me." And by 1870, which was
quite a few years later-- but when he was
quite young, actually, he had convinced himself that
Euclid's parallel postulate couldn't be proven and, in fact,
wasn't even necessary. That you could have other
parallel postulates besides Euclid's. He didn't publish
any of this, because it wasn't
good enough for him. Gauss had probably the highest
standards of any mathematician well into the
20th century. But other people were thinking
about the same things, and hyperbolic
geometry, um-- here, hyperbolic geometry,
unpublished was Gauss. Published--
Lobachevski and Bolyai. They came up with
a parallel postulate which said that it's possible
to have a triangle with a sum of the angles
instead of equal to 180, less than 180. Now, if you're saying,
"That's not a triangle," I can understand that, but it all depends on
what space you're in. If you're in Euclidean space,
this isn't a triangle. But if you're in
hyperbolic space, it is. And you say, "There's no such
thing as hyperbolic space..." Einstein found it
very useful when he was coming up
with the mathematics of general relativity. The space-time continuum--
curved space-- it's hyperbolic space. And then, Riemann
came up with... Fat triangles. The sum of the angles is
greater than 180 degrees. That's Riemannian geometry
or elliptic geometry. Now, the point of
all this was... prior to the 1860s,
there was one geometry-- it was Euclid's. When these came out, first, they were viewed as
nothing more than curiosities. It's like, "Oh, you have
a very interesting geometry. "Isn't that nice?" And then,
people moved on. But by around 1870,
it finally hit home-- this geometry
and this geometry, as weird as they
might have seemed, were absolutely as valid,
as true, as this one. Mathematically,
there was-- if there was something
wrong with this, there was something
equally wrong with this or something equally
wrong with this. Or to put it
in another way, if this was invalid
and this was invalid, then so was this. So, either accept them all or
you don't accept any of 'em. And that became apparent
by around 1870. So, "Euclid equals truth"--
gone. Gone. Gauss... another candidate for greatest
mathematician of all time. Did not publish
nearly as much as many other
mathematicians did, but his motto was,
"Few but ripe." He didn't publish a lot,
but what he published was essentially
perfect. Rigorous, correct in every
detail, high standards of rigor. If you've ever done a fitting
of curves to a set of data-- least-squares analysis--
Gauss came up with that. Differential geometry, which folks in Calc 3 are
gonna start studying soon-- differential geometry--
that was Gauss's. Hyperbolic geometry-- Gauss didn't publish
but he knew about that. Number theory--
Gauss did tremendous work
in number theory. Definitely the
greatest mathematician in the 19th century--
no question about that. And believe me, the 19th century
was quite a century for mathematicians. Algebra got modified
quite a bit. You know that cubic equation
we looked at earlier? Are you aware of the fact
that, in Europe, in Italy, in the 15th-- er, excuse me,
16th century Italy, there were actually publicly
held equation-solving contests? I kid you not. That is not hard to find online
and in many other sources. But they only worked
with equations up to the
fourth-degree... because after
the fourth-degree, things just didn't
work very well. And Galois established what is
now known as "Galois theory"-- I mean, he didn't call
it that himself, okay? (audience chuckling)
Um, Galois theory, which essentially proved that
if it's fifth-degree or higher, there's not gonna be
a formula for solving it. You're just gonna have
to use ad hoc techniques. Um, something
a little closer to home, something we can all relate to--
Hamilton and Cayley. Okay-- oh, and
if you're in Calc 3 or if you are in
linear algebra and taking any matrix theory--
with real numbers, we all know this is true,
and what is that called? >> (all) Commutative property.
>> Commutative property. The order in which you
multiply doesn't matter, right? Both Hamilton and Cayley came up
with noncommutative algebras, both as they were looking for
ways to describe rotations in three-dimensional
space. The idea that this was not true
for certain areas of mathematics was...
stunning. It was like,
"You could do that?" And it didn't take long
for mathematicians to go, "Yeah, we can
do that!" And then, they just discovered
all sorts of other algebras. You know, you give
mathematicians an inch, they'll take a kilometer.
(audience chuckling) Because you have to
convert the units. (audience laughing)
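A quick instance of what Hamilton and Cayley found,
using the 2-by-2 matrices of Cayley's own matrix algebra:
$$\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}, \qquad \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix},$$
so $AB \neq BA$. Hamilton's quaternions behave
the same way: $ij = k$, but $ji = -k$.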
Um... Euclid didn't equal
truth anymore. The parallel postulate was
shown to be unessential. You could use other
postulates which-- now, look,
for 2,000 years, mathematicians thought
that Euclid was infallible, and it turned out that
there were other geometries. And so, the whole idea
of, "Well, what about
other mathematics?" Well, what about
calculus? Is the foundation of
calculus infallible? Well, Joseph Fourier did
some work with heat transfer that involved
trigonometric series, and the short story on that
was what he came up with did really weird stuff
that nobody could explain. It was like, "What--
what are you doing?" And he said, "Well, it works,
so I'm gonna keep doing it." (audience chuckling)
There were some real cracks in the foundation
of calculus. In the mid-1800s, the definite
integral had never been defined, limits had never been
carefully defined, there was confusion
about the relationship between continuous and
differentiable functions. People didn't know
what they were doing. And so, the arithmetization
of analysis took place. Very quickly what happened
was if you're gonna do limits, you gotta base 'em
on real numbers. Real numbers can be
based on the rationals, rationals on the integers,
integers on natural numbers, and so... Peano came up with his
axioms for establishing the entire set
of natural numbers. Dedekind came up with
his version of Dedekind cuts, which defined real numbers
in terms of rationals, which then could be brought back
down to the natural numbers. Weierstrass gave the final
perfect definition of what a
"limit" is, and if you've ever done
epsilon-delta proofs of limits, this is the guy.
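For the record, that definition:
$$\lim_{x \to a} f(x) = L \quad\Longleftrightarrow\quad \forall \varepsilon > 0\ \exists \delta > 0:\ 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.$$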
Riemann defined
the definite integral. Cauchy-- not as
carefully as Weierstrass, but Cauchy worked
with continuity and differentiability. And then,
of course, Gauss-- always the most
careful of anyone-- way back when had worked
with convergence of series. By 1900, there was,
supposedly, a firm, unshakable
foundation for all of analysis
for calculus and everything
related to calculus. That takes us
to 1900. So, now--
oh, I almost forgot. How could
I forget this? Okay, I was tempted to say
"to infinity and beyond"-- (audience laughing) that is the Hebrew letter
"aleph" with a subscript zero, and that is pronounced
"aleph-null"... which is a symbol for
a certain level of infinity. So, I could have said,
"To infinity and beyond," which I believe was a very
popular phrase in a movie of some time ago, okay,
but I typed that up. Um, Cantor-- Cantor developed
his theory of sets to help bring solid
structure to this edifice that we call
"calculus." He overthrew Greek thought
of more than 2,000 years, which was "potential
infinity is fine, "but there's no such thing
as actual infinity." The Greeks, with maybe
a few exceptions-- there is some evidence
that Archimedes might have accepted
actual infinity, but that's tenuous
even now. For the most part, Greeks felt
that infinity was potential. You know, one, two, three, four,
five, six-- how far can you go? "Well, you can go
forever," you know? "I can name a number
bigger than you." "Oh, yeah?" "Yeah-- a million."
"A million and one." "A billion."
"A billion and one." You can keep going...
but that was potential. They accepted that. What they did not
accept is, okay, all of the natural numbers--
there is this set that contains all the natural numbers
and here it is. There's the set of
all natural numbers. It can be thought of as
an actual collection. You need to give that
back when you're done. (audience laughing)
There's an actual collection that they'd, "No,
there's no such thing." They do not believe
that was possible. Cantor not only embraced
this but ran with it, and if you want to learn
more about this idea that some infinities
are bigger than others-- in fact, some are a lot
bigger than others-- at our October seminar, Kelly Rozin
will be speaking to us on infinity. So, I have a feeling Cantor's
name might come up again. So, now, okay,
as I was saying, it was around 1900 and now we are at essentially
the modern, modern era-- um, you know, mathematics
of the last 100 years. That first bullet-point
is a gross understatement. Much, much, much more
mathematics has been discovered or created in the last 100 years
than in the previous 5,000. It is just mind-boggling
how much has been done. And so, you know, summarizing
that in two minutes-- no. But also the
problem is... most of it is pretty deep,
difficult, abstract stuff. And it's hard to say which of
that is really gonna be seen as important 100 years from now,
but since it's my talk... (audience laughing) I will now tell you
two things that-- when I'm thinking about
"How am I gonna end this?" I thought,
"You know what? "I'm gonna think
of a couple things "that I think will
still be significant "maybe, you know,
200 years from now." And I thought, "Oh, the first
two things that came to mind"-- well, here's
the first one. This is actually abstract
and fairly deep, but the gist of it,
a little bit simplified, is not hard
to grasp, okay? Kurt Godel-- in my opinion,
greatest logician of all time. Brilliant enough to be able
to walk around the grounds of the advanced-- the Institute for Advanced Study
at Princeton with Einstein and carry on conversations
at Einstein's level. They were very
good friends. I wish, I wish, I wish someone
had wandered with them every once in a while
and taken notes, 'cause it would have
been interesting. You know, maybe they did
talk about the weather. Maybe they talked
about other things, but I guess
we'll never know. Godel, in 1931,
essentially said, "You know these solid
foundations that you
think you've got? "It's an illusion." 'Cause here's what
he was able to prove-- if you take any formal
axiomatic system that's rich enough to contain
the natural numbers-- so, let's just use the natural
numbers as an example-- you've got
your axioms, you've got
your definitions. So, you create this system
for the natural numbers. It is impossible
for such a system to be complete-- no such system
is complete. In other words, Godel showed that there
will always be statements in that system that you
can't prove within the system. Furthermore, he showed
that one of those statements that you can't prove
within the system is the statement that
"this system is consistent." If you want to establish
that the system of all natural numbers doesn't
contain contradictions, you can't do it
within the system. Godel proved that. So, the dream of creating
rock-solid foundations that are infallible--
it may be true... we can't prove it. Some of the people
in this room who are not in
the math department like to remind me of things
like this every once in a while, just to try
and keep me honest, and I tell them to go away.
(audience laughing) And then, this is almost
diametrically opposite. This is not
abstract. This is practical. You've got cell phones,
you're on Facebook, you use the internet--
heck, I use the internet. I don't know what
a cell phone is. I've seen them.
(audience chuckling) YouTube-- all that stuff
that involves communication from one place
to another-- that all depends on
transferring information
using ones and zeros. Ones and zeros. It's all about
ones and zeros. Someone had to come up with
the mathematical formulation which said, "You can do it,
and here's how you do it," and that was
Claude Shannon. Now, Claude Shannon is a Michigan native--
he was born in Gaylord, went to the University
of Michigan, engineering and mathematics
Bachelor's degrees from
U of M, Master's in engineering
from MIT, PhD in mathematics
from MIT. The engineers claim him
as their own, but he has a PhD in mathematics,
so he belongs to us. (audience laughing)
And this paper-- this is-- this is, I mean,
a ground-breaking paper.
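That's his 1948 "A Mathematical Theory of Communication,"
and its central quantity is the entropy of a source,
$$H \;=\; -\sum_i p_i \log_2 p_i \ \text{bits},$$
the hard limit on how compactly information
can be coded into ones and zeros.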
Well, you're gonna use
your cell phone later? A couple of you have
been checking it in here while I've been talking--
that's okay. I'm not
glaring at you. Um, you're doing that
based on this mathematics. That made
it possible. So, whether you love the
digital era that we're in or you loathe it,
he's responsible. Without that mathematics,
you wouldn't have any of it. Um... that's about it in terms
of what I need to say. Now, if you want
to study further, these are all good-- um, my
favorite is probably this one. This is good but
it's higher level. This is good but
it's lower level. This is fantastic but it's
a tough read sometimes. If you want to know
anything about the history
of mathematical notations, you cannot do any
better than this. This is the source. You go online and you look
up "history of the plus sign," they took it from Cajori-- maybe
with or without attribution. And if you actually want a site
online that you can go to for history
of mathematics, that's out of the University
of St. Andrews in Scotland-- that's a good place. Uh, do you have any
questions about anything? Well, then we're done.
(audience chuckling) If you do have any questions and
wanna hang around, that's fine. Thanks for coming. (applause)