So you remember in the past we did two really big numbers in particular. We did Graham's number and TREE(3), right? If you remember, TREE(3) was this number that was so big that even just to prove it was finite, there wasn't enough time left in the universe to do it. The universe would reset itself before you ever got to complete the proof. It's just stupid; you can't even picture it. And Graham's number was so big that if you thought about it your head would collapse to form a black hole. So these were two really crazy numbers, but what I really want to talk about today is sort of the crazy of crazies. I want to talk about a number which combines both, and that's TREE of Graham's number. (Brady: I'm worried the universe is going to collapse on us) Well, you know, we need to worry about that. As you might remember from those original videos, both TREE(3) and Graham's number arise in a sequence. Let's remind ourselves very briefly where Graham's number came from. To get to Graham's number we started out with something which I'll call g(1), which was three arrow arrow arrow arrow three, right? Now this was a really big number. Just to remind you what these arrows were: if I have three arrow three, that just means 3 to the 3 - it's a fancy way of writing 3 to the power 3, which is of course 27. If I take a double arrow, well, that just means doing a repetition of the single arrow, in this case three times, so that would be three arrow (three arrow three), which is 3 to the 27, which is about 7.6 trillion, I think it is. Okay? And then if you had three arrows, well, that's just repetition of the double arrow. So that's three double arrow (three double arrow three), which is 3 double arrow 7.6 trillion - which is basically a power tower of threes, 3 to the 3 to the 3 to the 3 and so on, that is 7.6 trillion threes high. These are just stupidly big numbers. And the first rung on Graham's ladder, the Graham sequence, actually has four arrows, so it's just repetition of three arrows. So this is the first point on Graham's sequence, but then you go up, right? What Graham told us to do was to build this sequence up: you then go to g(2), where the number of arrows itself is this huge number g(1). So you now have g(1) arrows, and the arrows, as you can see, make numbers really, really big. So when we start putting really big numbers in as the number of arrows, we get crazy big. And he carried on with this sequence until he got to g(64), which was Graham's number. So you have this sequence: you start off with g(1), then g(2), all the way up to g(64). And you can carry on, right? This defines a sequence which you could call g(n), where each term is related to the one before by using the previous term as the number of arrows. So that defines the sequence. The TREEs also gave us a sequence. Now, if you remember from that video, the TREEs are basically this idea that you play a game with a finite number of different types of seeds and you try to grow trees, and the question is how long the game can go on for. So if you have one type of seed, the game can last a maximum of basically one go. So TREE(1) is 1.
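(A quick aside before carrying on with the trees: the arrow recursion and the g sequence are easy to write down in code, even though nothing beyond the tiniest inputs will ever finish evaluating. The following is only a minimal Python sketch; the names `arrow` and `graham_g` are just illustrative.)

```python
# A minimal sketch of Knuth's up-arrow notation and the g sequence, just to
# make the recursion explicit. Only the tiniest inputs ever finish evaluating.

def arrow(a, n, b):
    """a followed by n up-arrows followed by b."""
    if n == 1:
        return a ** b                     # one arrow is ordinary exponentiation
    if b == 1:
        return a                          # a with any number of arrows, then 1, is just a
    # each arrow level is repetition of the level below:
    # a (n arrows) b = a (n-1 arrows) (a (n arrows) (b-1))
    return arrow(a, n - 1, arrow(a, n, b - 1))

def graham_g(k):
    """g(1) = 3 with four arrows 3; g(k) uses g(k-1) arrows. Hopeless to evaluate."""
    arrows = 4
    for _ in range(k):
        arrows = arrow(3, arrows, 3)
    return arrows

print(arrow(3, 1, 3))   # 3 arrow 3        = 27
print(arrow(3, 2, 3))   # 3 double-arrow 3 = 3**27 = 7625597484987, about 7.6 trillion
```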
TREE(2), so if you have two different seeds, is 3. And then if you have three different seeds, the game can potentially go on for absolutely ages. This number is massive, okay? It's a really, really big number - it's actually bigger than Graham's number, that's what I told you - so we're going to think a bit more about that. But this is a truly gargantuan number, and of course you could carry on with this sequence. You could try TREE(4), TREE(5), TREE(6) - in general TREE(n). And you could even think about TREE of Graham's number. So that's what we want to think about. Now, we have two sequences here, right? And the real question I want to ask is: which sequence is better? So let's ask this question; let me write it down. [Hold music] So we've got these two sequences. We've got TREE of n and we've got g of n. So we've got the TREEs and the gs, okay? And we can build this number TREE of Graham's number, which is basically TREE of g(64). But I could also think about going the other way round. What do I mean by that? I mean, do the TREEs first and then do the gs. In other words, I start playing the game of trees first and then I climb Graham's ladder. So I mean take TREE of 64, in this case, and then evaluate that in Graham's sequence - g of TREE(64). So I'm doing it the other way around. In the first one I'm climbing Graham's ladder first, 64 rungs, and then I'm playing the game of trees. In the second one I'm playing the game of trees with 64 sorts of seeds, and then I'm climbing Graham's ladder. So which of these numbers is bigger? Okay, which is bigger?
- (Brady: Oh, can I have a guess? I got the impression from you in previous videos that TREEs are more powerful, so I'm gonna say that one's bigger.)
- But that's got TREE in it as well. (Brady: Yeah, but you're giving the TREE less juice there. You're only giving the TREE 64 juice, and there you're giving the TREE Graham's number juice) I think that's a beautiful way to put it actually, Brady, and you are right. Okay, but let's actually explore this a little bit more. Let's do it with a much simpler pair of sequences, which are basically just powers - so I can imagine a sequence where p of n is 2 to the n, so I take 2 to the power of whatever number you give me - and quadratics, so if you give me n, I give you n squared. So there's two different sequences. Neither of these is anywhere near as powerful as TREE or g, but they'll illustrate the point. Now, as you would put it, in this case the exponential is the one with more juice, okay? It's more powerful than a quadratic; exponentials grow more quickly than quadratics. In this case we can ask: which is it better to do first, and which last? Which is going to take us to the bigger place? Let's start with maybe n equals 1. If we do q first, we take the quadratic of 1, that gives me 1, and then I do p, which takes me to 2. Okay, so I get 2. If I do it the other way round, I start with 1, I take its power, that gives me 2, and then I take the quadratic, so that gives me 4. Okay, so you can see for n equals 1, doing the quadratic last actually seems to win. Hmm, that doesn't feel quite right. Let's try a bigger number. Let's do n equals 3. If I take the quadratic first, that takes 3 to 9. Then I take the power, and that gets me to 2 to the 9, which is 512 - quite a big number. Now let's do it the other way around. We start off with 3, we take the power, so that's 2 to the 3, which is 8, and then we take the quadratic: 64. So you can see, the minute we went to a slightly bigger number, it was indeed advantageous to do the big guy last. And actually, as long as you take n bigger than 2, it's always going to be better to do the big guy last. So it is true. You're right, Brady: the big guy, the most powerful guy, is the one that you should do last if you want to get some really, really big numbers. So now let's go back to TREE and g. Your intuition says that TREE is bigger. It's correct. But what's the mathematical way of measuring that? Is there a mathematical measure for that sort of thing, for how fast a sequence grows? Is there a measure of how fast TREE(n) grows, or how fast g(n) grows, or any other sequence? And there is. For these growing sequences, the thing that we use is something called the fast-growing hierarchies. So we're going to start off with a function that does grow, but doesn't grow very quickly - really starting off with basics. And that's the successor function. This is a very simple guy. It basically just says take n and go to the next one along.
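(Another quick aside: the "do the big guy last" point is easy to check by machine with the toy p and q sequences just described. This is only a sketch; p and q follow the names used above.)

```python
# p(n) = 2**n is the faster-growing toy sequence, q(n) = n**2 the slower one.
p = lambda n: 2 ** n
q = lambda n: n ** 2

for n in (1, 2, 3, 10):
    print(n, p(q(n)), q(p(n)))
# n=1:  p(q(1)) = 2       vs q(p(1)) = 4       -> the small-n exception
# n=2:  both give 16                           -> the tie
# n=3:  p(q(3)) = 512     vs q(p(3)) = 64      -> doing p (the big guy) last wins
# n=10: p(q(10)) = 2**100 vs q(p(10)) = 2**20  -> and the gap only widens
```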
- (Brady: Counting?) It's counting, basically. It's the first thing you learn at school, right? So 2 goes to 3, 3 goes to 4, and so on. So it's a growing sequence; it just doesn't grow very quickly. But this successor function is the seed for all these fast-growing hierarchies. I want something that grows, and obviously this doesn't grow as fast as TREE or g - I think we can agree on that. So let's try to get something that grows a bit faster, but this is the seed for everything else. So what can grow a bit faster? Well, you define f_1 of n, which is defined as doing this guy n times. So I'm going to do f_0 lots of times - in fact, I'm gonna do it n times on n. Another way of writing that is f_0^n(n), f_0 applied n times to n. So what am I really doing here? I'm adding 1 to n, n times. Where's that gonna take me? That's gonna take me to 2n, 2 times n. Okay, so you can see this has already grown more quickly. This one was just adding 1 each time; this one is now doubling. It's not a much faster growing function, but it is a faster growing function than the successor guy. And we've built it from that successor seed. Okay, so let's carry on; let's try f_2. We're gonna define it the same way: do f_1 n times on n. So what's this going to give me? Well, this is basically saying multiply by 2, n times. So that's going to give me 2 to the n, times n. So now we've gone to an exponential - we've gone from succession to multiplication by doing repeated succession, and then to exponentiation by doing repeated multiplication. I can carry on. What's the next guy, f_3? Well, that basically says do f_2 an n number of times. So what's that gonna give me? Well, I don't want to write it down. This is going to be something that's more like the double arrow - actually it's going to grow slightly more quickly than that guy, but it's a similar sort of growth rate. So this is what we call tetration. So you've got succession, multiplication, exponentiation, and this is now tetration. And I could carry this on - I could just carry on defining it in the same way by repetition of the one before. The next one would be pentation, then hexation, and so on and so forth. And I could do loads of these, and I go up the whole time - one for every integer, all the way up. Okay, great. Now, this gives me a measure for how fast functions grow. I can measure them against these guys.
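(For concreteness, here is a minimal Python sketch of these finite levels of the fast-growing hierarchy: f_0 is the successor, and each f_{k+1}(n) applies f_k to n, n times. The name `fgh` is just illustrative, and anything beyond tiny inputs is out of reach.)

```python
# Finite levels of the fast-growing hierarchy:
#   f_0(n) = n + 1, and f_{k+1}(n) = f_k applied n times, starting from n.

def fgh(k, n):
    if k == 0:
        return n + 1              # f_0: the successor function
    result = n
    for _ in range(n):            # f_k(n): apply f_{k-1}, n times
        result = fgh(k - 1, result)
    return result

print(fgh(1, 5))   # f_1(5) = 2*5 = 10        (repeated successor = doubling)
print(fgh(2, 5))   # f_2(5) = 2**5 * 5 = 160  (repeated doubling)
```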
- (Brady: So you could have f_100?) f_100, f of a googol, f of Graham's number, even, right? And my question is: where do the g sequence and the TREE sequence appear in this hierarchy? Are they faster than some of them and not others? What do you think? (Brady: Oh no, you've got a cheeky smile, so I think you're about to pull something here. I don't know - I mean, presumably because this f thing is infinite, it just keeps growing? It must get to a point where I can use it to - ) So that's a good point. It's certainly true that for a given index, if I put some large value of n in here, I could get a number that was bigger than Graham's number. But that's not what I'm asking. I'm asking: do any of these sequences grow more quickly than Graham's sequence or the TREE sequence? So for the same value of n, are they gonna give me bigger numbers? (Brady: I feel like you must be able to get there, because the way TREE works and the way Graham works is like set in stone, whereas this thing you can just keep making bigger and bigger until it's fit for purpose.)
- The answer's no. None of these - even if I carried on all the way through all the natural numbers - there's never going to be one that's as fast as either the g sequence or the TREE sequence. They're both faster. The TREE and the g, they grow quicker than any of these. And that's like, whoa - what am I gonna do now? How can I measure them? Well, the problem is that you're just dealing with finite indices. Look at what we've built up here. With that sequence of functions I had f_0, f_1, f_2 - they all had an index: 0, then 1, then 2 and 3 and so on. And in principle I could have had any natural number, arbitrarily large. But I was restricted to the natural numbers. Can I go one beyond? I can go to infinity. What comes after these numbers? Well, there's something important to notice about these indices: they had an order. There's a notion of hierarchy - f_0 doesn't grow as fast as f_1, which doesn't grow as fast as f_2, and so on. What mattered was that the indices came in order. So you can really think of these indices as more like ordinals rather than cardinal numbers. Cardinal numbers tell you how much; ordinal numbers add a bit of order to that - there's first, second, third, and so on. Once you've gone all the way through the natural numbers, where do you go next? Well, infinity, right? So I can talk about infinity, which I write as the ordinal infinity, Omega. And that's basically the thing that comes after all the others, all the finite ordinals. The thing that comes after them is the ordinal infinity. So that gives me a way to climb a bit higher. I'm going to define a new type of function that's labelled by this guy. Now, how am I going to do that? I'm just going to draw some of the fs that we've already created. Let me just draw a few of them: f_1 of 1, f_1 of 2, f_2 of 1, f_2 of 2... here are some of the fs, and we could carry on in all directions.
- (Brady: Where you've put an integer in the place of the n?)
Yeah, exactly. I'm actually going to look at their values now. So if I go along this way, I'm increasing the argument; if I go this way, I'm increasing the index. Okay, so let's just plug in some values for what these guys are. This is 2, this is 4, this is 6 - so indeed I grow as I go that way.
- (Brady: That was our doubling one) So this next row, f_2, is 2, then 8, then 24. The f_3 row starts at 2, then this guy is 2048 - a bigger number - and the next guy I'm not gonna write out because it has 121 million digits. This guy is huge.
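(That little table can be reproduced with the `fgh` sketch above, assuming that definition is already in scope; f_3(3) is skipped because it already has roughly 121 million digits.)

```python
# Reproduce the table of values: rows f_1, f_2, f_3, columns n = 1, 2, 3.
for k in (1, 2, 3):
    row = [fgh(k, n) for n in (1, 2)]
    row.append(fgh(k, 3) if k < 3 else "about 121 million digits")
    print(f"f_{k}:", row)
# f_1: [2, 4, 6]
# f_2: [2, 8, 24]
# f_3: [2, 2048, 'about 121 million digits']
```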
- (Brady: That escalated quickly) That did escalate quickly, as, um - what's his name, Ron whatsit? Can't remember his name now.
- (Brady: Ron Burgundy!)
- Ron Burgundy, yeah, yeah. So this has gone out of control quite quickly.
- (Brady: Amazing thing is, you're only at f_3 here) Yeah I know.
- (Brady: That escalated really quickly. f can go, like, to the millions, and you still say we're nowhere near the power we need to deal with Graham's number and - )
- Very, very true, right. But what I want to do is create an f, which I'm going to label with Omega, that grows more quickly than anything that's gone before. So how do I do that? Well, what's the quickest way I can grow in this picture? If I start out over here, what's the quickest way to grow? Straight along the diagonal - you go straight along. Look: in two steps you're suddenly at 121 million digits. If I went to the next step along the diagonal, to f_4 of 4, then I think this guy's somewhere between 10 three-arrows 4 and 10 three-arrows 5 - it's in that interval. So going along the diagonal gets you really big, really quickly. In fact, it grows much faster than anything that went before. So that's what we're going to call f_omega - the guy that picks out the growth along the diagonal. So we can define that: we just write f_omega of n, and define it as f_n of n. That's basically saying pick out the diagonal. So this would be the first position, second, third - already 121 million digits - fourth - already this stupidness. All the way down there we're just gonna get crazy. We're gonna get a really, really fast-growing function. Okay, now what kind of function is this gonna look like? Well, you can actually show that f_omega of n is generally going to be greater than something that goes like 2 with n minus 1 arrows and then n. So this is like building up the arrows, which might sound like something we've done before - it's how we built Graham's number. That built up the arrows; the index kind of went on the arrow, which made you get really, really big. That's why these other functions, the finite guys, were never able to do as well as Graham: they weren't hitting the index on the arrow, and putting it on the arrow makes it grow much more quickly. So this guy's doing much better. Okay, now we can talk about our next ordinal, the one after Omega - the one after infinity, which is kind of weird. I'm just going to call it Omega plus 1. Now, you may say: what on Earth is that? That's clearly nonsense, right? You're talking about infinity plus 1 - isn't infinity plus 1 just infinity? That's not really true when you're talking about ordinals, depending on how you define it. What I really mean by this is the thing that comes after what went before - what comes after Omega - so by definition it's not the same thing. You can't really do that with cardinals, but you can with ordinals, because the order matters. So I can, quite reliably, talk about something that just comes after this ordinal infinity; it's the next one along. How do I get that? Well, I already know how to add 1 to my sequence of fast-growing functions - I just do it the usual way: f of Omega plus 1, of n, is just f_omega applied n times to n. Now this is starting to terrify me, because look how fast f_omega grew - that was already growing with the index basically going on the arrows - and this guy's just gone off on one. This is even faster than that. This is mad. I think you can relate this to these Conway chains, but I don't even want to go there about how big this is growing. But I don't need to stop! Okay, I could carry on.
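(Continuing the sketch, and again assuming the `fgh` definition from above is in scope, the diagonal step and the "plus 1" step look like this in code; f_omega(3) is already the 121-million-digit number, so only the first couple of values can actually be printed.)

```python
# f_omega(n) = f_n(n): pick out the diagonal of the table above.
def f_omega(n):
    return fgh(n, n)

# f_{omega+1}(n): apply f_omega to n, n times - the usual "+1" rule.
def f_omega_plus_1(n):
    result = n
    for _ in range(n):
        result = f_omega(result)
    return result

print(f_omega(1), f_omega(2))   # 2, 8  (these are f_1(1) and f_2(2))
# f_omega(3) = f_3(3) has about 121 million digits, and f_omega(4) sits
# somewhere between 10 three-arrows 4 and 10 three-arrows 5, so we stop here.
```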
All right, I could carry on. I could just go Omega plus 2, and so on - I could keep defining these functions this way. Eventually I would run out of things to add. Well, what do I do then? I just add Omega: the limit of all of these is Omega plus Omega, which I call Omega times 2. It's kind of the next level of ordinals. But I don't need to stop then. I can carry on: I can do Omega times 2 plus 1, and all the while I'm defining new functions this way. Omega times 2 plus 2, and keep going until eventually I run out of finite numbers to add, and then I just say, well, the next one is gonna be Omega times 2 plus Omega, which I'm gonna call Omega times 3. But now look what I've got. All the while there are functions coming along for the ride, and they're just going mad, right? So what's next? Well, I've now got Omega times 1, which is just Omega, I've got Omega times 2, and Omega times 3. So I can start building this sequence up as well, just by doing these repeated additions. Eventually I'm gonna run out. What do I do then? Well, I say the next one along is Omega times Omega, which is Omega squared. So I can talk about a function which is growing like f of Omega squared, which is - ahh - I don't even know what it is. It's mad; it's just gonna be something crazy. I can carry on building these Omegas: Omega to the Omega, Omega to the Omega to the Omega, Omega to the Omega to the Omega to the Omega... until I run out. Well, what do I do then? Then there's another ordinal, which you call Epsilon-naught, which is kind of like an ordinal infinity that's just - wow! - way out there. Way bigger than this guy, way bigger than any of these. And there's a growing function which is labelled by that as well; it doesn't even bear thinking about how fast that grows. And you can carry on - there's a whole system of this, you can play with this sort of technology all the way up. You can define Epsilon-naught, then build up more and more: Epsilon-1, and you can even start talking about Epsilon with an index which is Epsilon-naught itself, and then another one whose index is Epsilon-Epsilon-naught, and you can grow, grow, grow. The next thing is called a zeta number, and then there's an eta number, and then eventually you just keep going and use these things called Veblen hierarchies, and all this. And then eventually you get to a point where you've got something which is just an off-the-scale, massive ordinal, which is called Gamma-naught - the Feferman-Schütte ordinal. It's the largest thing you can create using these kinds of recursion methods and this thing called the Veblen hierarchy, which we're not going to go into. And the point is, there are fs that come along for the ride. Does it even bear thinking about what f of Gamma-naught grows like? This is a crazily fast-growing function, right? I mean, it was so far away that we didn't even bother to describe it. Okay, so that's the million dollar question - or the TREE(3) dollar question, right? Where do the g sequence, the Graham sequence, and the TREE sequence lie in all of this?
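(To give a flavour of how the hierarchy is indexed by ordinals, here is a hedged Python sketch that only reaches ordinals below Omega to the Omega, written as coefficient lists of powers of Omega with a standard Wainer-style fundamental-sequence rule. The representation and names are our own choices, Epsilon-naught and Gamma-naught would need fancier notation systems than this, and only microscopic inputs ever terminate.)

```python
# An ordinal  omega^k*c_k + ... + omega*c_1 + c_0  (below omega^omega) is
# stored as its coefficient list [c_0, c_1, ..., c_k].

def f(alpha, n):
    alpha = list(alpha)
    if not any(alpha):                 # alpha = 0: the successor function
        return n + 1
    if alpha[0] > 0:                   # successor ordinal alpha = beta + 1:
        beta = alpha[:]                #   apply f_beta to n, n times
        beta[0] -= 1
        result = n
        for _ in range(n):
            result = f(beta, result)
        return result
    # limit ordinal: step down its fundamental sequence by replacing the
    # lowest term omega^j * c_j with omega^j * (c_j - 1) + omega^(j-1) * n
    j = next(i for i, c in enumerate(alpha) if c > 0)
    step = alpha[:]
    step[j] -= 1
    step[j - 1] = n
    return f(step, n)

print(f([0], 3))        # f_0(3) = 4
print(f([3], 2))        # f_3(2) = 2048, matching the table above
print(f([0, 1], 2))     # f_omega(2) = f_2(2) = 8
print(f([0, 0, 1], 1))  # f_{omega^2}(1) = 2 (an argument of 1 keeps everything tiny)
```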
(Brady: So what f, what function do we build that's gonna keep pace with TREE and - ) Exactly - which of these fs can keep pace with either g or TREE? Okay, let me tell you. Let's do g first. g we can handle: the g sequence doesn't grow as fast as f of Omega plus 1. You can kind of see why, right? Already at f of Omega we're starting to see the index appearing on the arrows, which is kind of like how Graham's number is built. So Graham's sequence doesn't grow any faster than this guy. Okay, what about TREE? TREE grows faster than this. It even grows faster than f of Gamma-naught. So none of these guys can handle TREE; the TREE sequence grows faster than all of them. Now, you can go beyond Gamma-naught if you want to, but it all gets really quite messy, so we're not going to go there. Safe to say that TREE is off-the-scale fast growing - even faster than this guy - whereas we could handle g. So the answer to the original question is that yes, indeed, TREE does have a lot more juice than g. So TREE of Graham's number is bigger than g of TREE(64), essentially. If you'd like some more from this interview, or maybe you'd like to watch some more videos about really big numbers, or you'd like to learn more about trees, check out the links on the screen and down in the video description. There's some stuff there we'd love you to see.