Complexity and Gravity - Leonard Susskind

Reddit Comments

Susskind is adorable. It's a sad world we live in which has but one Susskind for us

u/rebelyis, Aug 08 2018

Lots of great lectures from this PiTP summer school, "From Qubits to Spacetime," in this playlist.

u/BlazeOrangeDeer, Jul 26 2018
Captions
Well, it's a pleasure to be here. At my age it's a pleasure to be anywhere. I like to teach using the blackboard, but I realized this morning that teaching at the blackboard here is a daunting physical task which I may or may not be up to. I hope I am. I'll probably need an eraser. No, no, you're not understanding me: you will be my eraser. Good. Oh, and I just realized I'll probably also need a hugger. You'll be my hugger too, if necessary. Okay.

Complexity is a hard mathematical subject, and as most of you know I'm not a real mathematician by any means. I think it was Scott who said (I listened to his lecture at home) that I said, which I did say, "I don't do proofs." It's largely true: I don't try to prove things. In this particular case I think it's doubly true that it would be a mistake for me to try to prove anything I say, for the simple reason that practically nobody can prove anything about complexity. What you can do is say "I conjecture this, and I can prove it's equivalent to that conjecture," but neither conjecture can you prove. Complexity is a subject in which it is especially difficult to prove anything, and saying "I won't move forward until I can prove what I said" would be very counterproductive. It would be like saying "I won't use QCD until I can prove that it confines." To this day nobody can prove that it confines. If we want to move forward, we have to use other techniques.

My own technique for doing physics has always been: close your eyes, try to visualize the answer, and then say "that's what it is." No, that's not really true. What I mostly try to do is visualize the phenomenon (you don't have to close your eyes; get into the bathtub and visualize), and not just the phenomenon you're trying to understand, whether mathematical or physical, but a collection of different phenomena which are related in some way and which tend to reinforce the same view, until you come to the conclusion: look, if this were wrong, so many things would be wrong that I would have big problems. I'm quite serious about that. The other technique is the one Sherlock Holmes always used. How did it go? When you have eliminated everything that's impossible, what remains must be the truth, no matter how improbable.

For me the connection between black holes and complexity was a highly improbable idea at first, one which, after thinking about it for years, now feels like it has no alternative: it has to be right. Although I have to admit that the precise form in which it's right still remains to be given. Exactly what we mean by complexity, and its connection with black holes, is still work in progress. I think it's extraordinarily interesting work in progress, because I think it has more to do with the interior of black holes than anything else I can think of. Why should we be interested in the interior of black holes? I think everybody here would say that you don't really understand a black hole until you understand its interior, and you really don't understand quantum gravity until you understand horizons. So today I'm mostly just going to discuss complexity; I'm not going to discuss black holes at all.
I want to begin with why complexity is such an important subject in quantum mechanics, and I would say the reason is one that was pointed out by Feynman many years ago: Hilbert space is enormous. Hilbert space is much, much bigger than the spaces of classical problems. I want to begin by quantifying that, if I can figure out how to use these blackboards.

Let's start with K qubits: a space of states which is 2^K dimensional. Write down a general state, |psi> = sum_{i=1}^{2^K} alpha_i |i>, a collection of 2^K complex numbers alpha_i. Now I want to ask how big this space is. By that I don't mean how many orthonormal vectors are in it; I mean how many states there are altogether. You'll say that's a stupid question, there's a continuous infinity of them, since the alphas are continuous. Okay, let me regularize it by picking each alpha from some discrete, finite set of M possible complex values {alpha_1, ..., alpha_M}. Not that I think that has anything to do with quantum mechanics; it's just to make a counting possible. Then the number of states is pretty clear: it's M^(2^K), one factor of M for each of the 2^K parameters.

First it's interesting just to put in some numbers. Suppose M is 4, a very modest number, and suppose K is 4, four qubits. The number of states is about four billion. It grows fast. So let's take a logarithm, which may be a little more manageable: log(number of states) = 2^K log M, which is about 22 for those same numbers. Now there's a pattern here that I want you to remember. It's only the logarithm, but log(number of states) = 2^K log M is still very rapidly growing with K, in other words very sensitive to K, and only very weakly sensitive to M, the cutoff parameter, the way of regularizing. Very rapidly growing with K: that's the pattern I want you to remember.
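As a minimal numerical sketch of this counting, assuming nothing beyond the formulas above (the helper names are invented for illustration):

```python
import math

# Regularized state count for K qubits: M choices for each of the
# 2^K complex amplitudes gives M^(2^K) states in total.
def num_states(K: int, M: int) -> int:
    return M ** (2 ** K)

# log(number of states) = 2^K * log M: ferocious in K, mild in M.
def log_num_states(K: int, M: int) -> float:
    return (2 ** K) * math.log(M)

print(num_states(4, 4))        # 4294967296, the "about four billion"
print(log_num_states(4, 4))    # ~22.18, the "about 22" in the lecture
for K in range(2, 9):
    # squaring M barely moves the answer; each step in K doubles it
    print(K, log_num_states(K, 4), log_num_states(K, 16))
```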
Let's try to do a more precise counting. What is the space of states of a K-qubit system? By that I don't mean the linear Hilbert space; I mean the manifold of normalized states, modding out the norm and the phase. It's CP^n, where n is essentially 2^K (I'll fix the precise value in a moment). Let's count how many points there are in CP^n. Again there's a continuous infinity, so I have to regularize. Here is CP^n; it looks sort of like a sphere, but that doesn't matter. I'm going to introduce a regularization by filling it with little epsilon balls, balls of radius epsilon, and I'm going to identify states with epsilon balls. In other words, a sort of coarse graining, not distinguishing this state from that state. Then ask: how many epsilon balls are there in CP^n, as a function of epsilon?

We can do that calculation. The volume of CP^n is basically the volume of a sphere with the same number of dimensions, which is 2n real dimensions, because CP^n is a complex projective space:

Vol(CP^n) = pi^n / n!.

Keep in mind that n is not the number of qubits; it's the complex dimensionality of the space. The dimensionality of the Hilbert space is 2^K, and n will be 2^K - 1 for K qubits (you subtract one for a reason I can't remember). Now compare that with the volume of an epsilon ball of the same dimensionality, 2n real dimensions: V_B = (pi^n / n!) epsilon^(2n). So the number of epsilon balls in CP^n, which I'll again call the number of states, is

number of states = (1/epsilon)^(2n), with n = 2^K - 1.

Again take the logarithm, dropping the factor of 2, which is not important to me:

log(number of states) ~ 2^K log(1/epsilon).

Same pattern: very rapidly growing with K, very weakly dependent on the regulator.
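The epsilon-ball count lends itself to the same kind of sketch (again purely an illustration; the pi^n/n! factors cancel between the volume of CP^n and the volume of the ball, so only the exponent survives):

```python
import math

# Number of epsilon balls in CP^n with n = 2^K - 1:
#   vol(CP^n) / vol(ball) = (pi^n/n!) / ((pi^n/n!) * eps^(2n)) = (1/eps)^(2n),
# so log(number of states) = 2 * (2^K - 1) * log(1/eps) ~ 2^K log(1/eps).
def log_num_epsilon_balls(K: int, eps: float) -> float:
    n = 2 ** K - 1
    return 2 * n * math.log(1.0 / eps)

for K in (2, 4, 6, 8):
    # shrinking eps only rescales the answer; increasing K blows it up
    print(K, log_num_epsilon_balls(K, 1e-2), log_num_epsilon_balls(K, 1e-3))
```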
All of these spaces are huge, and that begs a question: how big, in a different sense, is the Hilbert space? How far apart can states be in this space? What's the maximum distance, the diameter of the space? To answer that we need a clear idea of what we mean by the distance between states.

One notion is the inner-product distance: d(A, B) = arccos |<A|B>|. That's the standard inner-product metric. With this metric the space is small, in the sense that the maximum distance between any two points is pi/2: d_max = pi/2.

On the other hand, that seems a little bit ridiculous. Let me compare two different kinds of states; I'll make up a story. I have a car, an old beat-up 1929 Ford, pretty much just a pile of rust now. My wife has a brand-new Mercedes-Benz. (This is not true, incidentally.) Her Mercedes is very up to date: its ignition switch consists of one qubit. It's on or it's off. Don't ask me what happens to the car when it's in an intermediate superposition. Now ask for two different senses of the distance between states. First, the distance between her car on and her car off. It's very easy to make that transition: you just flip one qubit, so in that sense the states are close. It's also easy to make a superposition of such states, and if you want to make a superposition and then measure the relative phase, that's easy too; it's just a single qubit. (I think Scott talked about this a little, the idea of relative complexity. Maybe he did; maybe I didn't listen to him well enough. I know I have a hard time listening to him. No, I watched it.) Now compare that with the distance between her brand-new Mercedes and my Ford, the pile of rust. In some sense the pile of rust is much further from the new Mercedes than the Mercedes-on is from the Mercedes-off, but the distance in the inner-product metric is the same, pi/2. So that metric does not capture the qualitative distance between states.

The other notion of metrical distance between states is called relative complexity. It's a new subject; it hasn't really existed up to now, and people have not pointed it out very much. How do we quantify the difference between the Mercedes and the old Ford? By a number of small steps. We assume, in quantum mechanics and in physics generally, that if you want to change one thing into another, you do it in small steps. Applying a single unitary operator to a Ford and making a Mercedes out of it is too hard. You first change this little piece, then that one, and that brings us to the idea of a quantum circuit.

Let's draw a circuit. Assume all the cars can be represented as collections of qubits; time flows horizontally. How do you change a state A into a state B? In steps, and the steps are gates: little unitary operators which involve small numbers of qubits at a time. You apply them until you get to B. Let's write an equation for it:

A = G_n G_(n-1) ... G_1 B,

where the G's are simple gates. In doing this you have to choose a collection of allowable gates. If you've chosen it well (in fact, if you choose it more or less at random), the gate set will be universal, in the sense that you can get from any B to any A by multiplying enough of these. Well, not quite: you can get from any epsilon ball to any other epsilon ball, no matter how small epsilon is. That's called a universal collection of gates. Now we can ask: what is the minimum number of gates? Not the number you might actually use, but the absolute minimum. (Yes, you can go from one to the other just by inverting G's; let's assume that a universal gate set contains G inverse, that is G dagger, whenever it contains G.) That minimum n is called the relative complexity of A and B, again up to the tolerance epsilon.

Now I want you to think of it the following way. Draw CP^n again, and suppose this is A and this is B. The sequence of gates is a path from B to A, a kind of discrete path, and the relative complexity is by definition the minimum number of such gates. So you can think of it as a geodesic on this space: a geodesic with the right definition of distance. Whatever relative complexity is, it's some kind of metrical notion (I won't yet call it a metric) of distance between states, and relative complexity is the length of the shortest geodesic between them. That's a theme we'll come back to over and over, and I believe it's the right way to think about complexity. I think one of the reasons people have not had powerful ways of thinking about complexity, and in particular of connecting it with physics, has been an insufficient appreciation of its geometric character. That's what we'll talk about today.
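Here is a toy brute-force search for relative complexity, purely as a sketch: a small illustrative gate set on two qubits (Hadamard on either qubit plus a CNOT, a hypothetical choice that is not fully universal), with breadth-first search returning the fewest gates taking B to within tolerance eps of A. Real complexity questions are of course far beyond this kind of enumeration.

```python
import numpy as np
from collections import deque

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
# An illustrative (not fully universal) gate set on two qubits.
GATES = [np.kron(H, I2), np.kron(I2, H), CNOT]

def relative_complexity(a, b, eps=1e-9, max_depth=6):
    """Minimum n with a ~ G_n ... G_1 b up to tolerance eps (brute force)."""
    queue = deque([(b, 0)])
    while queue:
        state, depth = queue.popleft()        # FIFO: breadth-first, so minimal
        if 1 - abs(np.vdot(a, state)) < eps:  # arccos|<a|state>| ~ 0
            return depth
        if depth < max_depth:
            for g in GATES:
                queue.append((g @ state, depth + 1))
    return None   # not reachable within max_depth

b = np.array([1, 0, 0, 0], dtype=float)       # |00>
a = np.array([1, 0, 0, 1]) / np.sqrt(2)       # (|00> + |11>)/sqrt(2)
print(relative_complexity(a, b))              # 2: a Hadamard, then a CNOT
```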
Now let's talk not about states but about unitary operators, unitary matrices on the space of states. Unitary matrices have two different incarnations in what we'll discuss. One is as operators:

U = sum_{ij} U_ij |i><j|,

where U_ij is the matrix representing U. But the same matrix can also represent a maximally entangled state of two systems:

|U> = sum_{ij} U_ij |i> |j*>,

where the second system's basis is conjugated (conjugation basically means time reversal). So the same matrix represents both an operator and a maximally entangled state, and whenever we talk about distances between operators, we're also talking about distances between maximally entangled states. If U_ij were delta_ij, this would just represent a product of Bell pairs.

Before defining distances, let's count how big the space of unitaries is. The space of states has dimension 2^K, so the matrix has 2^(2K) = 4^K matrix elements. To make it unitary you have to impose constraints, but the number of constraints is much smaller than the number of matrix elements. Suppose each matrix element can take on one of the same M complex values; then the number of unitaries is roughly M^(4^K). Even bigger than the space of states. We can do better with the same kind of epsilon-ball counting: calculate the volume of the unitary group of dimension 2^K, calculate the volume of an epsilon ball of the same dimensionality, and divide. I'll just give you the answer:

number of unitaries ~ (2^K / epsilon^2)^(4^K / 2),

tremendously large once K is bigger than a few. The logarithm is

log(number of unitaries) = 4^K (K log 2 / 2 + log(1/epsilon)).

Same pattern: it depends ferociously on K and very weakly on the regulator. So roughly speaking, unless you're especially interested in the epsilon dependence, call the log of the number of unitaries 4^K, and the log of the number of states 2^K.
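Putting the two logarithms side by side, as a rough numerical comparison under the same hedges as before:

```python
import math

def log_n_states(K: int, eps: float) -> float:
    return 2 ** K * math.log(1 / eps)                  # ~ 2^K log(1/eps)

def log_n_unitaries(K: int, eps: float) -> float:
    return 4 ** K * (K * math.log(2) / 2 + math.log(1 / eps))

for K in (2, 4, 6, 8):
    print(K, f"{log_n_states(K, 1e-2):.3e}", f"{log_n_unitaries(K, 1e-2):.3e}")
# the unitary count dwarfs the state count: 4^K versus 2^K in the exponent
```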
What about the distance between unitaries? Again there's an inner-product distance; it's just the inner-product distance between the corresponding maximally entangled states:

d(U, V) = arccos |tr(U† V)|.

Whenever I write a trace I will always mean the normalized trace, meaning the trace of the unit matrix is 1, so I don't have to write a denominator. And then there's the relative complexity of U and V: starting with V, multiply by a product of simple gates,

U = G_n G_(n-1) ... G_1 V,

and C(U, V) is the minimum number of gates that can satisfy this equation. The same idea as before: instead of A and B, put U and V. And again we can think of geodesics, now not in the space of states but in the space of unitaries.

Any questions? (It doesn't matter, I can't hear you anyway.) Ah, this is a good question. I've drawn circuits as if they were composed out of gates involving only two qubits at a time; that's called 2-local (t-w-o local). In general I'll assume a gate is a k-local thing, involving at most k qubits, where little k is much smaller than big K, the total number of qubits. Having said that, I'll probably wind up using k = 2 for almost everything. If I were doing SYK I might use k = 4, or even k = q. (If you don't know SYK, don't worry about it.) The point is that gates involve a small number of qubits, much smaller than the total.

So that's the idea of relative complexity for unitaries, and it plays the same role as the relative complexity of states. Two very different maximally entangled states will still be separated by at most pi/2 in the inner-product metric, so with that metric even the unitary space is in some sense small: high dimensional, but with a small diameter. The relative complexity, by contrast, can be very large. How large? We'll do a calculation at some point, but I'll tell you now: as big as 4^K for unitaries, and 2^K for states. Huge. The geometries defined by these two metrics are extremely different, and we'll discuss how different.

Next: relative complexity (I'll take the case of unitaries) is a metric. Let's see what that requires. First, C(U, V) >= 0. True: there's no way to have a negative number of gates. Second, C(U, V) = 0 if and only if U = V. Also true: the only way to connect U and V with no gates is for U to equal V. Third, it's symmetric; here I've used the assumption that if a gate is allowable, its inverse is also allowable, which lets you solve for V with the same number of gates. And finally the triangle inequality: C(U, V) <= C(U, W) + C(W, V). Why is that true? One way of getting from U to V is to pass through the intermediate W. If it takes a certain number of gates to get from U to W, and a certain number from W to V, then you can surely get from U to V with the sum of those numbers, and possibly fewer.
So the triangle inequality holds. These are the conditions for a measure of distance to be a true metric, and relative complexity is a metric.

What else can we say about it? It has the special feature of being what's called a right-invariant metric. Anybody know about right-invariant metrics? Anybody know about left-invariant metrics? We don't have a mathematician who studies left-invariant metrics? Well, just in case: right-invariant metrics are the same as left-invariant metrics, except they're right instead of left. Okay, what is a right-invariant metric? Take the equation U = G V, where capital G stands for the whole string of gates G_n ... G_1 connecting V to U. Multiply it on the right by W, any unitary operator: UW = G (VW). This says that if U can be connected to V by n gates, then UW can be connected to VW by the same number of gates; that is, C(UW, VW) = C(U, V). That's right invariance: invariance under multiplication from the right.

Now see what happens if we multiply from the left. WU = W G V, and inserting W† W between the gates,

WU = (W G_n W†)(W G_(n-1) W†) ... (W G_1 W†) (WV).

So WU is something times WV, but in general conjugating a gate by an arbitrary unitary matrix does not give a simple gate: this object will in general not be a product of n gates. It follows that multiplication from the left is not an invariance of the relative-complexity metric.

Any questions up to now? You're asking how the epsilon balls enter. I will always take the epsilon balls in the inner-product metric; that's a convention, but we'll always do it that way. And I'll come to it: close in the inner-product metric does not mean close in the complexity metric, and vice versa. They are very incommensurate things, but we'll come to that.

My conclusion from all of this is that complexity has a geometry to it, and that the study of complexity is a branch of geometry: the branch of right-invariant geometries on the group SU(2^K). Not widely studied, except by Michael Nielsen a little bit, and by me and Adam Brown a little bit.
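A numerical sketch of the left-invariance failure just described: conjugating a 2-local gate by a generic unitary W produces W G W†, which generically acts on every qubit, so a string of gates conjugated by W is no longer a string of local gates. (The locality test below is an ad hoc check invented for this illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_unitary(dim: int) -> np.ndarray:
    """Random unitary from the QR decomposition of a complex Gaussian."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))   # fix phases so the distribution is Haar

def acts_trivially_on_last_qubit(U: np.ndarray, n_qubits: int) -> bool:
    """True iff U = A (tensor) I on the last qubit."""
    d = 2 ** (n_qubits - 1)
    U4 = U.reshape(d, 2, d, 2)   # split row/column indices off the last qubit
    A = U4[:, 0, :, 0]
    return (np.allclose(U4[:, 0, :, 1], 0)
            and np.allclose(U4[:, 1, :, 0], 0)
            and np.allclose(U4[:, 1, :, 1], A))

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
G = np.kron(CNOT, np.eye(2))     # a 2-local gate inside 3 qubits
W = haar_unitary(8)

print(acts_trivially_on_last_qubit(G, 3))                   # True: G is local
print(acts_trivially_on_last_qubit(W @ G @ W.conj().T, 3))  # False, generically
```

Multiplying on the right, by contrast, leaves the gate string untouched, which is the whole content of right invariance.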
Now I want to come to graph theory. Graph theory is a mathematical subject whose lingo took me a long time to penetrate, but I understand it now, and I'll tell you how this is connected to it. (Let me go over to the last blackboard. Ah, my eraser, thank you. Also my hugger. You're watching him, not me; he's more beautiful than I am, for sure.) What I'm going to draw now looks like a graph, but it's not a graph in the sense I mean; it's a circuit.

I'm going to define the standard architecture for circuits; this is a definition. One, two, three, four, five... no: I always want an even number of qubits. Time goes to the right, and we divide it into intervals, each of duration delta tau = 1. Let me give you a little secret right now: when it comes to black holes, this time, the time of the computer, the clock that separates the intervals, is a particular kind of time. Anybody know which? It's Rindler time, and I'll show you why as we go along. The rule is that in each time interval, every qubit gets to interact exactly once, which means there are K/2 gates per interval (with six qubits: one gate, two gates, three gates per step). On the other hand, any pair of qubits can interact. This is called k-local but all-to-all: everybody can interact with everybody else, but in small groups. That's the standard architecture for quantum circuits. Notice we can count the number of gates: it equals the elapsed time multiplied by K/2. A circuit with three layers is called a depth-3 circuit; depth means how many layers you have. So "six qubits, k = 2, depth 3" specifies that kind of circuit.

This is not yet the graph I'm thinking about; let me come to graphs. First: how many distinct kinds of depth-one circuits are there? The first thing to pick is which qubits to pair, so the first question is how many pairings there are. You'd then multiply by the number of allowed gate types, but that's a small factor which won't make a big difference; the main thing is the number of ways of pairing. How many ways are there of arranging K things into ordered pairs? The answer is K!/(K/2)!. (It took me half an hour to figure that out; Adam figured it out in ten seconds.) This is the number of single-step circuits, and by Stirling it's approximately K^(K/2); there are some e's and pi's, but they're subleading and not important. I'm going to call this number D. Graph theorists would write it as little d, the degree of the graph: the degree is the number of edges coming out of each vertex.
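The pairing count is easy to check directly (a throwaway script; the function name is just illustrative):

```python
import math

# Ordered pairings of K qubits into K/2 pairs: K! / (K/2)!.
def num_pairings(K: int) -> int:
    assert K % 2 == 0
    return math.factorial(K) // math.factorial(K // 2)

for K in (4, 6, 8, 10):
    print(K, num_pairings(K), K ** (K // 2))
# same leading growth as K^(K/2); the correction is an O(1) factor per
# pair, the "e's and pi's" dismissed in the lecture
```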
Now here's the idea. Here is SU(2^K), just abstractly; don't pay any attention to the geometry, it's not important. We start at the center: the origin, the unit operator. (The complexity, as opposed to the relative complexity, of a unitary operator is its relative complexity relative to the identity I, which just means its distance, in some sense, from the origin.) Now we start applying gates. How many choices do we have for the first layer? We have D of them. Lay them out; don't pay attention to where each actually falls in SU(2^K). It's a decision tree: which of the possibilities do you choose for your first layer? What I'm drawing is a graph that represents all possible circuits. There are D edge ends, and D unitaries have been reached.

Could some of these be the same? Could you have reached the same unitary by two different short circuits? I'm going to say no. This is the principle of no collisions: you don't wind up in the same place. Why should that be true? Because of the very high dimensional nature of the space, the likelihood that two different single-layer circuits get you to the same point is very low. You can compute it; I don't remember the value, it may even be doubly exponentially small, but it's very, very small that two circuits hit the same spot on SU(2^K). So, first: no collisions. We'll adopt that as a provisional principle, not a permanent one.

Now go to the next layer; the circuit grows. Each of these points can branch to another D - 1 or so new points, and so it goes, on and on: as the circuit is allowed to evolve, the tree gets bigger and bigger. This tree is a graph; that's what I mean by a graph. Assuming no collisions, how many unitaries do I reach with depth-two circuits? The answer is D^2 = K^(2 K/2). How about depth d? K^(d K/2). Now d K/2 is nothing but the number of gates up to that depth: K/2 gates per layer times the depth d. And here's the point: if there truly are no collisions, then when you reach a given point, you have reached it in the shortest possible way. Suppose there were a shorter way; then there would be a collision between the two paths. So, up until the point where collisions occur, we can just count the gates and call that the complexity, because it is the minimal number of gates that gets you there. The complexity of an endpoint at depth d is C = d K/2, and the number of unitaries with complexity C is

K^C = e^(C log K).

So the number of unitaries with a given complexity grows exponentially with the complexity. Keep that in mind; it's fast growth, and it means most unitary operators have very high complexity. This is a clue about the nature of the geometry.
I want to compare this with something. When you have a tree like this, you can forget where it came from and just think about the abstract tree, and you can think of particles hopping around on it. An actual quantum circuit evolving corresponds to starting someplace, hopping someplace else, then someplace else again. Take an ensemble of these: an ensemble of circuits, each of which may evolve differently from the others. They all start at the center, they grow, and they fill up the tree. (Incidentally, random walks on trees are a very well developed subject; there are lots of theorems.)

The number of configurations of one of these particles on the tree grows exponentially with the distance: K^C = e^(C log K). I maintain that this formula says you can think of a growing gas, and that C log K is nothing but the entropy of that gas. Write it differently: the volume, meaning the number of states, the number of unitaries with complexity C, grows like K^C; solving, C log K = log(volume). That is very similar to saying that entropy is the logarithm of a volume in phase space. So if I had such a gas on this space, I would identify the complexity, the distance from the origin, as a measure of the entropy of the configurations at that distance. This, I think, is an important point which as far as I can tell has been completely missed by complexity theorists: quantum complexity is a kind of entropy. Not ordinary entropy; these are not orthonormal states we're counting, and we'll come to what's different. But it is a kind of entropy: a measure of the number of states with a given complexity.

That is why there is a second law of complexity. Just look at the graph: suppose you found yourself at some vertex after two layers. What's the most likely thing to happen next, to move out or to move in? Out, by a large factor, whatever D (or D - 1) is. There's a very strong tendency to move outward, simply because the number of states grows so rapidly with complexity. The tendency for complexity to grow is just this fact, and it's very similar to the second law of thermodynamics, where entropy is the logarithm of a phase-space volume and that volume grows exponentially with the entropy. I harp on this because I think it's a very important point.
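A toy simulation of that tendency, assuming a clean D-regular tree with no collisions (an invented model, not a real circuit): from any vertex, one edge points back toward the identity and D - 1 point away, so the distance from the root, the complexity in this picture, drifts outward at rate (D - 2)/D per step.

```python
import random

def walk_distance(D: int, steps: int, seed: int = 1) -> list[int]:
    """Distance from the root of a uniform random walk on a D-regular tree."""
    rng = random.Random(seed)
    dist, history = 0, [0]
    for _ in range(steps):
        if dist == 0 or rng.random() < (D - 1) / D:
            dist += 1        # outward: D-1 of the D edges increase distance
        else:
            dist -= 1        # the rare step back toward the identity
        history.append(dist)
    return history

h = walk_distance(D=12, steps=1000)
print(h[-1])   # near 1000 * (12 - 2) / 12 ~ 833: essentially linear growth
```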
Any questions about anything up to now? (You're asking about the literature.) I think there's very little literature about this specifically, though of course there is a lot of literature about the relation between groups and graphs like these: Cayley graphs. We'll come to that. Is there literature? I suspect there is, but I can tell you it's been a nightmare for me to penetrate what there is; it's all in a math-speak lingo I don't speak, so I've mostly had to guess my way through by the seat of my pants, and that's what you're getting.

(And the complexity of an operator?) The complexity of an operator is the minimum number of steps: K/2 times the minimum depth of a circuit that gets you to that unitary. With the no-collisions assumption, the unitary you reach is reached in the fastest possible way. Call the endpoints of the tree the leaves: at each leaf there is a unitary, and its complexity is just the graph distance along the tree times K/2.

(How different are the two metrics?) I'm about to tell you that the difference is huge. There are two extremely different geometries on the same space. Let me describe it with a picture; no equations, just pictures. On one side there's the abstract decision tree, which grows exponentially as you move away from the root, and on which the complexity relative to the origin is just the distance multiplied by K/2. On the other side, here is SU(2^K); it doesn't really look like a sphere, but I have no other way to draw it. Now map the tree into it. Start at the origin. Go one step: you're fairly likely to reach something almost orthogonal to where you started, so you go a fairly long way in SU(2^K), perhaps a distance comparable to the diameter of the whole space, in a single step. The other branches go to other faraway places. Next layer: each of those goes someplace else again. Keep in mind this is a very high dimensional space; what I'm drawing is at best a cartoon, but it gives you some feel.

The way SU(2^K) gets filled in is extremely fractal (I think this is what you were getting at). It depends on epsilon, but suppressing the epsilon dependence, it fills in very irregularly. In particular, two points can be very close in the inner-product metric and yet very hard to reach from one another with a small number of gates. That's especially true if the small inner-product distance corresponds to e^(i epsilon O) with O a very high weight operator: a large-weight operator means one involving a lot of qubits. If going from here to here requires an operation involving many qubits, then although the two points are very close in inner product, they can still be very far apart in complexity. In that sense the two metrics are very incommensurate, if that's the right mathematical word; very, very different.

(You're asking about universality.) Right: there are basically two kinds of ambiguities when you actually start trying to calculate complexities. This is a guess to some extent, but I think it's probably right. One is additive and one is multiplicative. The additive ambiguity has to do with epsilon: epsilon always comes in additively, as log epsilon times the dimensionality of the space. There's an analogue for strictly classical entropy. Classical entropy has an epsilon dependence, where epsilon is the coarse graining of phase space; you draw little epsilon balls in statistical mechanics too, otherwise you get infinite entropy.
Every configuration has an entropy that is logarithmically divergent in epsilon; you fix epsilon, and it enters only as an additive constant proportional to the number of degrees of freedom. Same thing here: there's an additive constant, log epsilon, which is epsilon dependent but doesn't affect, for example, the way complexity grows, or differences of complexity. Then there's the multiplicative ambiguity. Suppose I originally allowed only two-qubit gates and now decide to allow three-qubit gates. Every three-qubit gate can be made out of some number of two-qubit gates, so by including three-qubit gates you simply diminish the number of gates you need, but only by a multiplicative constant: something to do with how many two-qubit gates it takes to make an arbitrary three-qubit gate. So there's always a multiplicative ambiguity in the complexity, and we'll have to take that into account. But up to the additive and multiplicative ambiguities, at least when the complexity is relatively large, complexity seems to be fairly universal, and I'll take that assumption: it's universal, or close to it.

So the two metrics have very different meanings, very different shapes, and very different purposes. The inner-product metric is the one you use in quantum mechanics all the time: when two states are close in the inner-product metric, all expectation values are close. If all you're interested in is expectation values, you don't care about relative complexity. But if you care about how hard it is to make a transition from one state to another, or how hard it is to interfere them, then you do care about relative complexity. Keep that in mind.

Okay, I said no collisions. (Let's see, we have another half hour.) Of course, eventually there will be collisions. You can see it: eventually the number of leaves becomes equal to the number of epsilon balls. When that happens you can't stuff in any more leaves; you can stuff more in, but they will land on previously visited epsilon balls, and then you'll have loops in the geometry. So let's estimate when that happens. It will also define something else: the maximum complexity, which is reached when the thing saturates and you simply can't get anywhere new without revisiting previously visited epsilon balls, which are known to have lower complexity.

Let's make the estimate. On the left-hand side, write the number of leaves after a depth-d circuit; from before, that's K^C, where C is the number of gates, the complexity. When does that equal the number of epsilon balls? The total number of epsilon balls, written down earlier, is (2^K / epsilon^2)^(4^K / 2): our epsilon-ball estimate of the size of SU(2^K).
Setting the number of leaves equal to the number of epsilon balls defines the maximal complexity. (Is that clear, or should I explain it again? It's just the point at which there are no new epsilon balls left to visit.) Solve it, and the most important thing about the answer is that it's of order 4^K; more precisely, something like

C_max = (4^K / 2) (K log 2 + 2 log(1/epsilon)) / log K,

if I'm not mistaken. The main thing is the 4^K: the maximal complexity is of order 4^K. The other thing this calculation assumes (not says, assumes) is that loops, collisions, don't occur until they absolutely have to. This is closely related to the conjecture Scott quoted, which he and I are supposed to someday write a paper about: that the rate of complexity growth is maximal. It's basically the same assumption. Everybody who does this sort of thing will agree that the maximal complexity is of order 4^K. I don't know if they'll agree with this level of detail, but there is a log epsilon, and they will agree there's a log epsilon: that's the Solovay-Kitaev theorem. In fact the Solovay-Kitaev theorem is not as powerful as this; it says there's some power of the logarithm, unknown, thought to be one.

Let's summarize. You can think of the maximal complexity as the maximum distance between any two points in the complexity geometry: the complexity diameter, which is of order 4^K. What about the total number of epsilon balls, or equivalently vertices? That's of order e^(4^K). So the diameter is logarithmic in the number of vertices.
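The saturation estimate solves in one line of algebra; here it is as a script (a sketch of the estimate above, with the caveat that the lecture itself hedges the subleading details):

```python
import math

# Solve K^C = (2^K / eps^2)^(4^K / 2) for C by taking logs:
#   C_max = (4^K / 2) * (K ln 2 + 2 ln(1/eps)) / ln K.
def c_max(K: int, eps: float) -> float:
    return (4 ** K / 2) * (K * math.log(2) + 2 * math.log(1 / eps)) / math.log(K)

for K in (2, 3, 4, 5, 6):
    print(K, f"{c_max(K, 1e-2):.3e}", 4 ** K)   # the leading behavior is 4^K
```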
Now, what happens next? What happens when you reach maximal complexity, the outer boundary of the tree, and you add another layer of circuit? You've got to go somewhere, but you've filled up all the epsilon balls, so you have to go someplace you've visited before. This is not completely clean: where you land won't necessarily sit right on top of a previous epsilon ball; you'll wind up somewhere near one, perhaps off to the side a little. I'm going to ignore that. It's a bit of a fake, but I'll ignore it anyway, to give you some feel for what the geometry is like. So the next step brings you around to some previously visited place, and not a nearby one (I'll come to why it can't be nearby); each leaf jumps to some other faraway place, and from there the graph just continues, recycling previously visited sites.

So what do we know about the overall graph, given that we fudge a little: when a point comes out close to an epsilon ball, give me the license to move it onto that ball. I have another drawing here that I like better. The circle is just a guideline for me; the tree is the important thing. You start at the center (this one is a tree of degree three, though of course the real degree is very high), and the outer boundary of the tree is where the complexity is maximal. It's going to connect up somewhere: this point may go to that point, and so on. Could this point connect to a nearby point? The important thing about this picture is that it's homogeneous: every point is the same as every other point, because we're talking about a group space. Now notice that near the origin there are no small loops; the smallest loop you could possibly have would be to go out to the full diameter and come back. By homogeneity, there can't be any small loops anywhere. (This is a fact that was taught to me, in a slightly different context; I made the mistake of not realizing it at first.) So this leaf has to jump way around to there, that one to there, and so forth. You expect big jumps in this picture, precisely so that there are no small loops anywhere; the smallest loop is expected to be about as big as the diameter.

(Very good question: can you jump back to the origin?) The answer is that almost all points are close to maximal complexity. Can you jump back to the origin? Yes, but remember the total number of vertices is e^(4^K), some monstrously big number, and almost all of them are out near the boundary. So it would be very rare and exceptional to jump anywhere except near the boundary. We'll come back to that.

Let me write down what we know about this graph. One: it's finite. By assumption, I fudged it so that when I come back near an epsilon ball I stick the point on top of the previous one; a little bit of a fake, but I'll take the geometry to be finite. Two: it's tree-like about any vertex, where tree-like means up to some distance, until you get out to maximal complexity. Three: it's D-regular, which simply means every vertex has the same degree, the same number of edges coming out of it: the number of independent unit-depth circuits. Four: it's sparse. What's a non-sparse graph? A complete graph, where every site connects to every other site; there the degree is comparable to the total number of sites, and that's called dense. Here the number of sites is doubly exponential while the degree is much smaller, so the degree is much lower than the number of sites: that's called sparse.
What else? Its volume: the number of vertices is e^(4^K) or so, the total number of unitaries in the space. On the other hand the diameter is 4^K, so the diameter is like the logarithm of the total number of vertices. And finally one more term: the girth of the graph. The girth is the size of the smallest closed loop (like my girth), and it's comparable to the diameter: also of order 4^K.

Graphs like this are very hard to understand and hard to build, and yet almost all graphs, random graphs, are like this, more or less. Graphs with exactly this set of properties are called expander graphs, and believe me, I struggled to find out what an expander graph was, because the language is very unfamiliar to me. But this is what an expander graph is. The subject of expander graphs is all over the place in computer science, though not for this reason.

So what we have here is a geometry which is right-invariant (not left; sorry, right). We'll talk more about right-invariant metrics; if you're ever interested in them and want to read the best possible literature, it's Vladimir Arnold's book on classical mechanics, Mathematical Methods of Classical Mechanics, and it's spectacularly clear, if you can get it in English. Arnold: the famous mad Russian mathematician. How a Russian got the name Arnold, I don't know. His first name was Schwarzenegger. That's okay.

All right, how much time do we have? Fifteen minutes. The last thing I want to talk about is this notion of a second law of complexity, just qualitatively. Imagine an ensemble of systems. An ensemble can either mean you have one system and you're doing statistics over the ways it can move, or you can imagine a gas of a large number of systems; averaging over the gas is the same as averaging a single system over the different ways it can move. The ensemble starts with all of its elements near the center. Follow one of them as it moves out: its complexity increases from 0 to K/2. Next it has to make a choice of where to go. It could go back, but that's very unlikely, for the reasons we discussed, so it probably moves out again, and the complexity increases again. Plot the complexity against time for this single element of the ensemble: it goes from zero to K/2 to twice K/2, and so forth, and it keeps moving out, with an extremely small probability of moving back in. So the complexity increases linearly for some time.

Then it gets out to maximal complexity and finds it has to revisit sites. Say it jumps around to over here (I'm picking this one just to make it visually exciting). It may or may not hit an absolutely maximally complex point, but it will hit something close to maximally complex, because there are so many more things that are maximally complex. The new point may be slightly inward relative to the old one, so the complexity might decrease a little.
But at the next step it will overwhelmingly move outward again, back toward the maximally complex things. Before that, ask what its complexity is when it jumps around to here. Remember, the complexity is not the number of steps you took to get there; it's the minimum number of steps you could have taken, and that minimal path is now more likely to be a shortcut: fewer steps than going all the way out and around. So all of a sudden something has happened: the complexity has stopped increasing. The geodesics have changed: instead of following the original geodesic to compute the complexity, we follow an altogether new one. Follow the motion a little further: it moves out toward maximal complexity again, jumps somewhere else, and the same thing happens again. The complexity stops increasing; it can't increase any more. It took an exponential time to get there, an exponential number of gates, so the time is some exponential of K, and the complexity up at the top is of order 4^K. The system simply wobbles around up there, with the minimal trajectory changing from instant to instant. The whole structure that led to the linear increase breaks down, and something radically different happens.

Eventually (somebody pointed this out before), by sheer luck you might get to someplace where, after a series of steps, you come back to the origin. That's incredibly, exponentially unlikely. The amount of time you'd spend hovering near maximal complexity before making one of these freak reversals back to low complexity is the recurrence time, in this case the quantum recurrence time, which is not e^K but of order e^(e^K). These are quantum Poincare recurrences, and they take practically forever. But if you have one and you look at it, you'll see an interval in which the complexity decreases, pretty much following the time reverse of the original trajectory. It doesn't easily get all the way to the bottom, but suppose it does; then it turns around and goes back up, following the same kind of trajectory. So very rarely, but intermittently, you'll find structures that look time-reversal symmetric about their center: the complexity goes down and comes back up. The implications of that for black hole physics are fascinating and interesting, and we'll come to them.

I think it's probably time for me to finish; I'll take questions for the next few minutes, if that's okay. (Yes, the hyperbolic geometry; I should have said that.) A tree is like a hyperbolic geometry, insofar as the way it grows as you move out. If there's a continuum version of this, and I think there is, it would have a very hyperbolic character: every point would be surrounded by a hyperbolic space, but globally it would have this coming-back-on-itself behavior, which is very hard to envision; the whole thing is very high dimensional. It has the character of an expander graph. In the notes that I sent out there's a picture somewhere of some expander graphs.
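A crude toy of the complexity-versus-time story described above, assuming the same biased tree walk as before but with a hard cap standing in for the exhaustion of epsilon balls (the doubly exponential recurrences are, of course, invisible on any runnable timescale):

```python
import random

def complexity_history(D: int, c_max: int, steps: int, seed: int = 2):
    """Biased walk capped at c_max: linear rise, then a wobbling plateau."""
    rng = random.Random(seed)
    c, history = 0, [0]
    for _ in range(steps):
        p_out = (D - 1) / D if c < c_max else 0.5   # crude plateau model
        c += 1 if (c == 0 or rng.random() < p_out) else -1
        c = min(c, c_max)
        history.append(c)
    return history

h = complexity_history(D=12, c_max=300, steps=2000)
print(max(h), h[-1])   # rises linearly, then hovers near c_max = 300
```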
Those pictures are not very illuminating; they just look like hash. But they're all expander graphs, and one of them is a very special kind of expander. The more sparse they are, the more strongly they satisfy the basic conditions I wrote down, up to some point. If they satisfy them very strongly, in other words if they grow as quickly as possible and have these properties in spades, they're called Ramanujan graphs. As far as I can tell they have nothing whatever to do with Ramanujan, but they're just really, really good: a Ramanujan graph practically saturates the maximal growth rate, which is the growth rate of a pure tree, up until it gets out to the boundary. I think the graphs here are likely to be Ramanujan graphs. (Anybody here know about expander graphs? Douglas, you don't? I thought you did. Okay, a little bit.)

Is Henry here? These are sort of fudged Cayley graphs, right? Henry has a very beautiful construction which is not the true quantum complexity (am I allowed to say this? okay): instead of unitaries, it studies a classical problem, permutations on computational states instead of unitary operations, and it studies exactly the same things I've described here, for the permutation group instead of the unitary group. There it really is a Cayley graph, and Cayley graphs definitely have the expander property. So we do know a little about it.

Okay. Next time: black holes. And the third time: back to the second law. Good.
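In the spirit of the permutation-group construction mentioned at the end, here is a minimal Cayley-graph sketch: the symmetric group S_n with adjacent transpositions as generators (an illustrative generating set chosen for simplicity, not for expander properties), with "complexity" read off as graph distance from the identity.

```python
from collections import deque

def cayley_distances(n: int) -> dict:
    """BFS distances from the identity on a Cayley graph of S_n."""
    identity = tuple(range(n))
    gens = [(i, i + 1) for i in range(n - 1)]   # adjacent transpositions

    def apply(p, g):
        q = list(p)
        q[g[0]], q[g[1]] = q[g[1]], q[g[0]]
        return tuple(q)

    dist = {identity: 0}
    queue = deque([identity])
    while queue:
        p = queue.popleft()
        for g in gens:
            q2 = apply(p, g)
            if q2 not in dist:
                dist[q2] = dist[p] + 1   # BFS gives minimal gate counts
                queue.append(q2)
    return dist

d = cayley_distances(5)
print(len(d))            # 120 = 5! vertices, the whole group
print(max(d.values()))   # diameter 10 = 5*4/2, reached by the full reversal
```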
Info
Channel: Institute for Advanced Study
Views: 120,250
Rating: 4.9186759 out of 5
Id: 6OXdhV5BOcY
Length: 87min 25sec (5245 seconds)
Published: Thu Jul 26 2018