We've got a number which is apparently so famous that it's up there on the mathematical constants list, up there with pi and e and root 2; but this one I'd never heard of until recently. And apparently- well, they're pretty sure it's irrational, so I'm just going to give you the first bit of it. This is Feigenbaum's constant, or one of Feigenbaum's constants - there's another one. If you find references to two of them-
I'm not going to talk about the second one, which has the symbol alpha. There's a paper in, I think, the 1970s by Robert May about an equation, which I'm going to write down, which models population growth. X, as is traditional, and this is going to be like a population level. And it's a way of modelling how things breed, so maybe rabbits or frogs; let's go with rabbits, because they breed like, yeah, rabbits. It's going to have some other variables in it; this is what they call the logistic map. Map is like a posh word for a function: you get an input and an output, and it's mapping one value to another one. But what this does is it predicts what the next year's population is going to be for a community of animals. The first component here is the next year's population,
so that means this one here is the previous one. And so we're kind of counting populations; they're going to be between 0 and 1, like an empty, dead population and a full population, and that's what this number is as well. So we're going to iterate this: we're going to get an answer out and put it back in to
get the next year's thing. And then the important number here is this lambda. It's like the fertility: if this is high, these things are going to breed like rabbits; and if it's zero, they're not going to breed at all. And changing this number is what the biologists were using to model different populations. If the previous population is high you'd expect the next
one to be high, so you're kind of multiplying by the previous year's population. But also if it's too high they're going to run out of food, so that's why you do one minus the previous population. So this is like the breeding, and this is like the starvation. What's interesting is you can just put numbers into this; so I'm going to start with a population X1 of 1/2. So remember, it's between 0 and 1; the population is half full to start with. In fact you can start with anything - what's interesting is what happens over a long time. So we need to choose a fertility. I'm going to choose - just
for this particular example - lambda, the fertility, to be, I don't know - it turns out it
needs to be between 0 and 4, there are complicated reasons for that which I'm not going to go into right now - so I'm going to choose 2.3. So we're about in the middle of our range of fertility. So this calculation tells me that the
next year's population has gone up: I've got 0.575. So my population's increased; I've got enough fertility that the breeding is happening well. What's interesting though is what happens over a long time. For year three we get 0.562. (Brady: The population dropped?)
- Yeah, the population's dropped. So we've clearly gone high enough that the competition has started kicking in, and maybe they can't sustain that high population. What's interesting is if you wonder: is this going to bounce lots of times? And actually typing this in every time is extraordinarily tedious - although I know you're watching this for exactly that reason - but we could
speed this up, and you can do it on any of these calculators because it has an iteration function. If I clear this and I type 2.3 times answer - whatever the previous answer was is going to turn up there - and then 1 minus answer, then I never have to type anything in again, which is my idea of maths.
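The calculator trick - type 0.5, then repeatedly apply 2.3 × Ans × (1 − Ans) - can be sketched in a few lines of Python. The map is x_{n+1} = λ·x_n·(1 − x_n); the constants 2.3 and 0.5 are the values chosen in the video (the function name `logistic` is just my label for it).

```python
def logistic(x, lam):
    """One year of the logistic map: breeding (lam * x) times starvation (1 - x)."""
    return lam * x * (1 - x)

lam = 2.3   # fertility chosen above
x = 0.5     # starting population: half full

# "Press equals" a few times, printing each year's population.
for year in range(2, 6):
    x = logistic(x, lam)
    print(f"x{year} = {x:.5f}")   # matches the 0.575, 0.562, 0.566... read out above

# Keep pressing: the sequence settles down to a fixed point.
for _ in range(1000):
    x = logistic(x, lam)
print(f"settles at {x:.5f}")      # about 0.56522
```

The limit is no accident: a fixed point solves x = λx(1 − x), which gives x = 1 − 1/λ = 1 − 1/2.3 ≈ 0.56522, exactly the value the calculator keeps repeating.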
So next year - I'm not going to write them down anymore - but next year, X4, we get 0.566; it's gone up. And if I keep pressing equals - I'm not going to read them out anymore, but keep an eye on what's happening, I'm just going to keep pressing equals: 0.56494, 0.565, 0.56519, 0.56522, 0.56522, 0.56522, 0.56522 - okay, you know what's going to happen. It looks like actually - and this is believable, right - sometimes populations stabilise. And this is good, because this is the real world being described by a simple equation. This is not all that exciting, I have to say, but at least I can do this calculation repeatedly; I can bash the button as many times as I like and it's not changing anymore. It's genuinely doing the calculation, but it's giving the same answer - it's what they call a fixed point of the iteration. So the rabbits are breeding, but they're also dying, and they're dying and breeding at the same rate and something's balanced. And this is good, because the biologists needed a simple way of modelling real populations, and real populations do balance out eventually. What's nice about
this is that this map - or this function, or this equation - also manages to capture some more complicated behaviour, which is what happens if you change our lambda number. Which is what we should do now. I won't write all the calculations, but I can show you on my calculator what happens if you change lambda. I'm going to do a lower fertility, and I'm going to do this quickly, because I think you might be able to predict what happens if
we have less fertile rabbits. So I'm actually going to reset my population to be a half: I'm going to put 0.5 and press equals and store that as the previous answer. We'll do a lambda of - do you want to pick a lambda between 0 and 1, Brady?
- Can we have 0.65? We can have 0.65, I'll do 0.65. So we start with a half, and after one year
we have a population of 0.1625. Your rabbits are dying. But I can now just keep pressing equals to see what happens: 0.08, 0.05, 0.03 - this is not looking good. I'm just going to go for it now: 0.0009, 0.0006 - eventually my calculator is going to say, look, stop pressing equals, they're all dead. In fact I'm now at zero. And maybe it's not quite zero, but it's off the end of the accuracy I care about.
- (Brady: I killed the rabbits!) Yeah. Well, again, useful to a sort of heartless biologist, to model how sometimes populations die out. What happens in the wild, though, is that sometimes populations die, and sometimes they're stable, and sometimes they do funky things. And it kind of surprised them that
they could get the funky things. So one funky thing: I pick a higher fertility. I'm going to set up my population to be a half - 0.5, equals, that's the first answer gone in. And then I'm going to use lambda is 3.2 this time, so clear this: 3.2 times previous population, times 1 minus previous population. And this time we started at half, and after one year we've got 0.8. Population's gone up - no big surprise, fertile rabbits. Next year, well, there's lots of them now, so the competition kicks in: next year, 0.51. It's dropped way back down to almost where we started. Too many rabbits - there's not enough grass, or the foxes are like, wha-hey, rabbit season -
and I realise we're not counting foxes in this thing, but somehow this is capturing the behaviour that populations go up and then quite often drop suddenly if there's too many. If I press equals again, we're at 0.79; it's gone back up again. Definitely feels like we've got a bit of a bouncy behaviour, and it's believable. If I keep pressing equals: 0.51, 0.79, 0.51, 0.79 - in fact I'm going to press it a few times, it might take a bit of catching up, but if I slow down in a moment: 0.51304, 0.79946, 0.51304, 0.79946. It's not changing - except it is changing every year, but it's not changing between these two values. And so it's stabilised; we've got what they call a fixed point, but it's not just one fixed point - we have a two-cycle. - (Brady: A yo-yo fixed point?)
- Yeah, exactly, and if you drew a graph of it you'd see it bounce up and
down; and it might be interesting to draw some graphs of this in a moment. I'll draw you a graph. The first behaviour we had was rabbits doing this, stabilising. And then the next behaviour you had was rabbits dying. And then we've just seen something else can happen - and it happens out there in the real world - which is that you start somewhere and it sort of bounces up here, then down, but it doesn't stabilise
ever; it keeps bouncing like this. Now the reason this map - this function - is being used is that it's capturing even more complicated behaviours than just life and death. So we've got life, but
a high population - too many rabbits - so they kind of die off, but next year there's loads of space for them, so they grow again, and then they bounce. I'm going to pick
lambda to be - this is a carefully chosen value - I'm going to pick it to be 3.5, let's try that one. Going to set the population
to be 1/2; 3.5 is lambda, multiplied by the previous population, multiplied by 1 minus the previous population - this time it goes up to 0.875, 0.38, 0.82, 0.5 - wait, these are all different. I'm just going to keep pressing it, but it's definitely not settling down; I'll show you what's happening on the screen. We've got 0.5, 0.87, 0.38, 0.82, 0.5, 0.87, 0.38, 0.82 - actually it is cycling, but it's not two, it's now four. And so if I drew a graph of this one, this time it's going down, up, down a bit further, up a bit further, and then repeating; so actually we've got a four-cycle. At this point, doing this on a calculator becomes a bit of a waste of time, but what the biologists were doing with a computer was saying: okay, even complicated behaviour like a four-year cycle of populations can be modelled by this, with still just this one parameter, this one fertility thing. And that surprised them. What they didn't know was about to happen, was what happens if you go above 3.5 - because no one ever did, because it looked like it didn't do anything useful. If you change lambda beyond roughly 3.57, something strange happens, which we should try and
see on a picture. I'll do a little graph here. This time it's not time along here - this is going to be the lambda axis, so I'm going to capture different values of lambda, the fertility, on this axis; this is still going to be population, between zero and one. I'm going to mark this at four, and this is one. So this is the behaviour after it's settled down. It turns out if you pick lambda between 0 and 1, eventually things die - as you nicely exemplified, you killed all the rabbits, because they're not fertile enough. So after you've pressed equals thousands of times, we're going to plot what happens.
- (Brady: So this is the Death Zone?) This is the Death Zone, yeah, let me write
death here, because over time the population stabilises at zero. You chose 0.6-something, and it was in the Death Zone, so this is kind of your point here. When we did it earlier we chose 2.3, which is about here, but it stabilised at about 0.565 when we did it, so it's kind of just over half. So there's a mark on our graph here. And if you do this - and you could plot this on a spreadsheet if you wanted to - you see that the stability changes in this sort of pattern, and that different values of fertility give you different stable populations.
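The different long-run behaviours seen so far can be checked in one go. Here's a small sketch (the helper name `attractor` and its tolerances are my own choices, not from the video): iterate past the transient, then count the distinct values the population keeps revisiting.

```python
def attractor(lam, n_settle=2000, n_sample=64, x0=0.5, tol=1e-6):
    """Iterate the logistic map past the transient, then collect the
    distinct values the population keeps revisiting (its cycle)."""
    x = x0
    for _ in range(n_settle):
        x = lam * x * (1 - x)
    cycle = []
    for _ in range(n_sample):
        x = lam * x * (1 - x)
        if not any(abs(x - c) < tol for c in cycle):
            cycle.append(x)
    return sorted(cycle)

# The four fertilities tried in the video, in order:
for lam in (0.65, 2.3, 3.2, 3.5):
    cycle = attractor(lam)
    print(f"lambda={lam}: {len(cycle)}-cycle, values {[f'{c:.5f}' for c in cycle]}")
```

Running this reproduces the whole story: 0.65 gives a single value of zero (the Death Zone), 2.3 gives one value near 0.565, 3.2 gives the two values near 0.513 and 0.799 seen on the calculator, and 3.5 gives the four-cycle.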
Interestingly it sort of levels off at about two-thirds; it looks like you can't have a stable population that's fuller than that, according to this model. But you've already seen that something interesting happens: we tried 3.2 earlier, and 3.2 gave us a two-cycle, which means over a long time we're going to have two marks on this graph. And so I'm just going to estimate they're about here and here. And the graph, if you plot it, bifurcates, which is a posh way of saying it forks - they call it a pitchfork bifurcation. So we've got dead, we've got stable, and then from about three we get a two-cycle. But we have also seen that at 3.5 it settled into a four-cycle, so then you can kind of guess what's going to happen. If I put the four in - they happen to be around here - you can see that somewhere it splits. And it bifurcates again, or it forks again. The posh word for this: the period of these cycles - this is a one-cycle because it's got one value, this is a two-cycle,
the period's two, and then it goes to four - and they call this a period doubling. What's intriguing is that by picking certain values of lambda you can get any period you want, as long as it's a power of two, because it doubles every time. So if you want a sixteen-cycle there is a value of lambda - I'll let you go and find it - and it's not much higher than the 3.5 we used, because, come and have a look at this graph again: the graph doubles again, and again, and again, and again, and quicker, and quicker, and quicker. So a tiny change in this lambda gives you a new branch. And it gets insane - it gets so you can't make a change even at the accuracy of your calculator and expect to notice the difference, but massively different behaviour is happening. And if you recognise this sort of thing - this was being studied before the word chaos was used, but we now understand the word chaos technically; apart from just meaning mess,
it means technically something which is really sensitive to initial conditions. So this is a good example: a tiny change in our lambda gave you massively different behaviour. And from a modelling point of view for rabbits, this stops being useful, because it becomes very hard to predict what's going to happen with a certain value of lambda. But there is a final twist. Before I get to the final twist - which is what happens in that graph further to the right - Mitchell Feigenbaum got involved at this point, in the 70s. And he was looking at this thinking: it's surprising that a simple function does an interesting thing like this. And he started to study how quickly this doubling happened. So that took a little gap, and this took a shorter gap, and this takes a shorter gap again, because it's going quicker and quicker. And he thought it'd be interesting to know if this is a sort of regular scaling thing; he didn't think it would be, because it's doing crazy stuff, but he started dividing one of these regions by another - so if you divide that width by that width, and then that one by the previous one, just to see the ratio. And he got this number - Feigenbaum's constant, it's called now: 4.669. Now, in itself it's quite amazing that this has a nice regular behaviour; what it's saying is that this gap is about 4.6 times smaller than the previous gap. So the period doubling is getting quicker and quicker - it's a geometric series, in fact. If you multiply that by
1 over 4.669 you'd roughly get this one. And if you keep going, he managed to show that this is the ratio it's tending towards. But he didn't know what this number is, and still no one knows what this number is - we don't have a closed form for it. What is genuinely shocking is that he then looked at some other equations. Forget biology now; he just looked at other quadratic equations - if you haven't spotted it already, I'll point it out: this equation here is a quadratic, because it has an x times an x. He looked at all sorts of other functions, all of them having a single sort of hump - this is a quadratic map with a single hump. And he found that every function he looked at not only exhibited this period doubling, if you started fiddling with the parameter, but it also scaled with that number. And it turns out this is now called Feigenbaum's constant, because every unimodal map with a quadratic hump scales like this. And no one knows what this number is, or why things are behaving like this, but it seems to describe a whole raft of things which no one had ever looked at before. And there was this concept of universality: even different situations are still captured by this number. So I'm going to recreate the diagram we had earlier on paper,
I'm just going to start that here. So we've got our lambda along here, and this little red dot which is now tracking across - we're in the Death Zone here, this is where the population is dead, but after 1 you see it spikes up. So this is a stable population now. And it's going to go all the way to 3 before it does - you remember what happens - it's going to split, bifurcate, at 3. So that's our two-cycle. And then I told you that it does something round about 3.4; just a little bit afterwards - I'll show you that now -
it splits again. That's our second bifurcation. But you know it's going to get quicker and quicker - this is what Feigenbaum found. So after this it's going to period double very, very quickly. But actually at a point which is about 3.57 - it's not very far away - you can prove the period doubling must stop, because if you add a geometric series up to infinity there's a limit. And so the prediction doesn't tell you what happens after that point, but you can see it. We can go beyond infinity right now. So just before 3.6 the period doubling is going to stop and something else is going to take over - just
watch this. So there's a very strange behaviour: it's nice and simple, and then crazy. And actually this is a good example of chaos as well. I'm going to zoom in on that region. So here's the two-cycle, and then it's split again to a four-cycle, and then eight, then sixteen, and then right there we've passed what we call the onset of chaos. And it's a technical term - chaos meaning really sensitive. But it's almost beautiful; there's weird structure happening here, this is not a glitch in my software - and yet it looks like for most of it it's spitting out random numbers.
So these are the populations; every time you press equals it's creating a dot.
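Those dots are easy to generate yourself. The sketch below uses my own helper name (`settled_points`), and the bifurcation lambdas are commonly quoted textbook values rather than anything derived here: it collects the settled-down populations for a sweep of lambdas - scatter-plot the (lambda, x) pairs and you get the bifurcation diagram - and then takes ratios of successive gaps between doublings, which head towards Feigenbaum's 4.669.

```python
def settled_points(lam, n_settle=500, n_keep=50, x0=0.5):
    """Run the logistic map past its transient, then record where the
    population ends up - one dot per press of 'equals'."""
    x = x0
    for _ in range(n_settle):
        x = lam * x * (1 - x)
    pts = []
    for _ in range(n_keep):
        x = lam * x * (1 - x)
        pts.append(x)
    return pts

# One column of dots per lambda; scatter-plotting these pairs
# reproduces the picture: one branch, a fork at 3, forks of forks,
# then the chaotic band.
diagram = [(lam / 100, x)
           for lam in range(0, 401)
           for x in settled_points(lam / 100)]

# Commonly quoted lambdas where the 2-, 4-, 8- and 16-cycles appear:
doublings = [3.0, 3.449490, 3.544090, 3.564407]
ratios = [(doublings[i + 1] - doublings[i]) / (doublings[i + 2] - doublings[i + 1])
          for i in range(len(doublings) - 2)]
print(ratios)   # successive gap ratios, heading towards 4.669...
```

Dividing each gap by the next, as Feigenbaum did, already gives numbers in the 4.6-4.8 range from just the first four doublings.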
- (Brady: Does this get crazier than, like, the Mandelbrot set?)
- You're right to spot that; the Mandelbrot set is involved with this, because this is a fractal. If you zoom in on stuff - so if I zoomed
in on this period-doubling bit you could see more period doubling; it's self-similar. But then the self-similarity seems to go a bit crazy. Actually this is the Mandelbrot set, but you're only seeing one dimension of it. The Mandelbrot set, as you may know, is all about using complex numbers - you need two-dimensional numbers. If you just looked at the central line of the Mandelbrot set and plotted what the points do when you do the iteration, this is what you're seeing. And actually this was happening at the same time as Benoit Mandelbrot was discovering the beginnings of fractals and chaos, so this is all tied together. What's lovely about this is that even in chaos you get order turning up. So this here is a three-cycle: it's going crazy over here, and then suddenly it settles down to stable behaviour, a three-cycle; in fact it doubles - you
get 6 and then you get 12 - but then it goes back to chaos again. And nobody expected this behaviour to happen from a really simple equation; it's just the same equation we've been using all along. And it was the beginning of chaos theory - what they call non-linear dynamics now - where simple deterministic equations have very unpredictable results. And one massive application for it is if you want a computer to generate a random number - you can't.
Like, you tell the computer to pick something random and it goes, uhh, 7? No, it doesn't
have an 'uhh' function like we do - although we always pick 7, it seems. But if you wanted a computer to spit out a random number, you have to tell it what to do, because it's a computer; and what you could do is tell it to use this function, press equals a few times near the end of our diagram, and it would spit out a number which appears to be random. This formula was one of the first they ever used to make what we now call pseudo-random numbers. If you've enjoyed this video you might be
the sort of person who would like the Great Courses Plus. This is a huge on-demand video library starring the best lecturers and teachers from around the world. Joining up gives you unlimited access to this vast library of really in-depth, meaty courses about all sorts of things, including some brilliant ones about mathematics - for example, this one all about fractals. But whatever your passion or interest, the Great Courses Plus is going to have something, probably many things, up your
alley. Just lately I've been watching this particular video all about the early days of aviation, because, well, I love anything to do with planes. To have a look, go to TheGreatCoursesPlus.com/numberphile - the link's there on the screen, and there's also a link in the video description you can just click on. Using the link will give you a free one-month trial with total access to the whole archive. So why not give them a look? It's a way of showing a bit of support for Numberphile, but also a chance to do some serious binge learning. [Outtakes] Oi, what are you doing?
- Hello, Audrey! You heard about chaos and thought,
that'll be exciting, yeah?
- Aww, cutie. You are delightful! Come on, out we go. Out, out!
This guy is amazing on numberphile, I hope he makes more videos.
I did my master's project on things related to this and it all came rushing back when watching it! Man I miss studying maths!
the wikipedia page is a pretty good intro: https://en.wikipedia.org/wiki/Feigenbaum_constants
Wait, is he right when he talks about chaos and the fact that tiny changes in initial conditions result in massively different behaviors, and he says tiny changes in his lambda lead to massively different behaviors? I mean, his lambda isn't an initial condition, it's a parameter of the system. The way I thought it worked was you had to change your x_n by a tiny bit and observe changes in the behavior of x_(n+something), all at a fixed lambda.
I'm really glad you shared this. I had never heard of this constant before and I'm now fascinated by it! This is probably why I love math so much.
Can anyone explain how a biologist would have gone about deriving or proving the equation in the video? What type of evidence/axiomatic base could have gone into the formulation of this model?
pretty cool. I'm taking a class on dynamical systems this semester actually, hopefully it will be good.
So.... what happens for lambda > 4?
reading up on this again, with new eyes. saw this in grad school and didn't understand it at the time
this is a 1-dimensional mapping - have people found cross-system/similar constants for higher-dimensional fractals, like there are for one-dimensional mappings?