If the universe is as we think
it might be, a sort of closed system
surrounded by a horizon, then that horizon behaves roughly
like the horizon of a black hole. In any self-contained system that satisfies
any version of the standard laws of physics, entropy will increase until it comes to thermal equilibrium. Once it comes to thermal equilibrium, not much happens. Thermal equilibrium is extremely boring. It really is a dead world. However, the classical story of the increase of entropy misses
part of the evolution. Quantum entanglement is still evolving; things keep happening. We propose the second law of
quantum complexity in direct analogy to the second law of thermodynamics. This is a really provocative suggestion. It struck me as something that did not compute. They came to the conclusion... old Lenny used to be a good physicist; he got into this complexity story,
but he didn't know what complexity was. The standard laws of physics are given their most rigorous stress tests
inside the most exotic of cosmic locales: black holes. On top of the much-buzzed-about work
on singularities and event horizons in the 20th century, a new enigma has appeared, having to do
with how big black holes are on the inside. About a decade ago, Stanford physicist
Leonard Susskind found that the interior spacetime of a black hole is so warped
that it can seemingly grow forever. This appears to conflict
with the same laws of physics that bring a glass of ice water
to thermal equilibrium. Those laws say that entropy
should increase toward a maximum value, at which point things in the universe, including black holes, should stop
changing in any meaningful way, a fate referred to as a "heat death." At this point, why would
black holes continue to change? Susskind and his collaborators
set out to solve this paradox, and in doing so,
they stumbled upon an unlikely explanation for the near infinite expansion
of a black hole's interior. My friend, who was my friend
and a good friend, Feynman, he would close his eyes
and visualize a system. And then when
he had a visual picture of it, he would convert that to mathematics. I don't have the mind of a mathematician, I have the mind of an auto mechanic. What works is what's practical, but sometimes you need some mathematics. I learned about black holes,
and at that time black holes had nothing to do
with complexity. But there was a puzzling feature
that I didn't understand. We had these pictures, they were called Penrose Diagrams. It's a kind of map of the whole universe,
including a black hole. The black hole is over here. Time goes upward. The interior of the black hole
had this strange behavior. The interior of the black hole
came to a smallest size and then expanded. It bounced and it started to grow. And I didn't know what
that bouncing thing was. And I gave up,
I didn't know what to make of it. I would ask my friends,
black hole experts: what's growing here?
And they would say, "I don't know... it's just the volume of the black hole." Okay. Physicists
started to look towards entropy, a concept that has a long history
in science and engineering. Entropy was invented by people
during the Industrial Revolution who wished to understand
the maximum possible efficiency of steam engines
and other similar machines. And what they were interested
in was the entropy of the working gas in their engine. So this was both an important theoretical
understanding and also big business. That is the story, though, of entropy:
a statistical claim that the disorder of the system
always increases. But for Susskind, something didn't fit. I knew that it wasn't entropy
because the entropy of the black hole
comes to thermal equilibrium very quickly. On the other hand,
you can look at the Penrose Diagram and you could see that expansion
lasted for a very, very long time. So if you start with black coffee and milk and mix them together, it's not going to
take very long before you, in fact, have at least, as far as the intermixing
is concerned, thermodynamic equilibrium. And in fact, it'll look, at least
as far as thermodynamics is concerned, like the arrow of time has stopped; the thermodynamic arrow of time
has ceased. The system is now in a heat
death situation. There is no longer
any ability to generate more entropy. However, that is not the end of the story
because, even though the system has reached thermodynamic equilibrium,
it has not reached complexity equilibrium. Complex systems are characterized by individual parts
that interact in a multiplicity of ways, leading to phenomena like non-linearity, randomness, and emergence. Sensing that complexity might be at the root of the explanation for why black holes
continue to grow past thermal equilibrium, Susskind and his collaborators turned to the mathematics
of theoretical computer science. Feynman taught us
that the number of quantum states of a system is enormously large. If you have a certain number of qubits, the number of orthogonal quantum states of that system
is exponential in the number of qubits. What that means is that, for example,
a quantum computer, or just an ordinary system, will take an enormously long time to explore the entire space of states. It will start out at some simple state and start
wandering around in the space of states for a much, much longer time than it takes
to establish thermal equilibrium. All of a sudden it was bang: the complexity of a quantum state just
increases and increases and increases, long, long past the time when the system
has apparently come to thermal equilibrium. And the size of the interior
of the black hole was a direct measure of the complexity
of the quantum state of the black hole. When that hit me, I suddenly got really, really interested in the concept
of computational complexity. For a classical system, we basically describe it like: oh, I have a bunch of bits
and they can be 0101, right? I can just describe individually
what state each of them is in. But for a quantum system,
that's not enough. We cannot describe the system
by starting from the first qubit and then describing the second qubit,
because they can be globally entangled; locally, there might not be information
at all, but globally they might be in some highly
non-trivial quantum state. Entanglement is the key to
why complexity can grow so large for a quantum system. Working with his Stanford colleague
Adam Brown, Susskind developed the idea that even after
classical entropy has reached its maximum, the interior of a black hole
can continue to evolve because of the ever-increasing complexity
of its quantum state, meaning that there's life after heat
death for black holes after all. And to make
their case to the scientific community, they borrowed a longstanding concept from computer
science known as circuit complexity. Originally formulated to measure the
complexity of a computer program's output, quantum circuit complexity had become
the unlikely mathematical language for a new theory attempting to explain
the evolution of black holes, a theory that wasn't without its critics. I first heard that there was a connection being drawn between circuit complexity
and black holes way back when I was in grad school, when Lenny Susskind and his collaborators
made the suggestion that circuit complexity might be playing
a role in understanding quantum gravity. And when you first hear this as a computer scientist,
this is a really provocative suggestion. Lenny had formulated
this, and I happened to be at that talk. And what I found hard to believe about it is that he was equating something
that is physical, you know, volume, with something that, as a complexity theorist,
I would say, you know, this is quite un-physical: Quantum Circuit Complexity. That's something
that's manifestly hard to compute. I'm not sure we had a particular
hypothesis, but something felt funny. To put Susskind and Brown's quantum
circuit complexity theory of black holes to the test, Adam Bouland and his collaborators
brought in tools of modern cryptography to serve as a testing ground. The state that Susskind
and his collaborators were considering actually looks a lot like a quantum
analog of a block cipher. A block cipher is a type of algorithm
that underpins modern cryptography, including the technology
behind cryptocurrencies. It works by scrambling information
behind an encryption key-phrase that must be entered exactly
right to gain access to the information. For Adam Bouland,
the similarity to Susskind's theory was in the chaotic mixing
within a quantum system. His work drew a comparison between
scrambling the information in a cipher and the rearrangement of qubits as a quantum system evolves... backing up Susskind's approach. Funnily, after this long
period of exploration, our work ended up, in some roundabout way, justifying
Susskind and collaborators' conjectures. That was a really unexpected
result of our investigation. We realized that rather than reinforcing
our skepticism about his proposal, what our results suggested
was that, in fact, that was the only sensible way
to resolve the paradox. So that was a really interesting thing,
you know, that he had this deep intuition. It's pretty remarkable! Faced with a possible explanation for life
after heat death, Susskind and his collaborators decided to propose
a new fundamental law of the universe. Eventually,
we just said, let's put forward the idea that the computer scientists
had never put forward... they hadn't said it. Let's put forward
the idea that there is a second law of complexity: that complexity behaves
very much like entropy, in that it always, on average, increases until it maxes out. We proposed
the second law of quantum complexity in direct analogy
to the second law of thermodynamics. I would say that it is not a law
in the sense that it's been proved. It is a conjecture that there should exist
such a law. But the analogy with the classic second
law of thermodynamics is sufficiently strong that we felt
emboldened to propose such a law. At the moment, I think it's something
which really applies to black holes. The extent to which it applies to the
universe as a whole is not entirely clear. So this idea of using quantum
circuit complexity, this is treating the quantum system
in a quantum mechanical way. Rather than starting
from a classical starting point, going into some semiclassical approximation, and then bootstrapping from there. That can take us somewhere, but it's not all the way. So once we accept that the world
is quantum and we fully embrace that, then we can actually understand things
much, much better because we're now
standing inside the quantum world and we're using fundamentally
quantum language to describe it
as we should have for a long time. And that will be a fundamental change of how
we think about these kinds of systems. As for what the implications of complexity growth
are for the evolution of the universe: at the moment,
this is unexplored territory. Eventually,
if you wait an exponentially long time, even the quantum complexity achieves
its maximum value. Even the quantum complexity has its own analog of heat death... and then maybe the story begins again. You know, as people say, if you wait long enough, everything will happen,
including what you started with.
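The timescales behind these closing remarks can be made concrete with a toy back-of-the-envelope sketch. This is a hypothetical illustration, not a calculation from the film: the polynomial thermalization time, exponential complexity-saturation time, and doubly exponential recurrence time are assumed schematic orders of magnitude for an n-qubit chaotic system.

```python
import math

# Schematic timescales for an n-qubit chaotic system (assumed orders of
# magnitude, for illustration only):
#   thermalization   ~ poly(n)     -- entropy saturates quickly
#   complexity max   ~ 2**n        -- growth continues long past equilibrium
#   recurrence       ~ 2**(2**n)   -- doubly exponential: "the story begins again"

def timescales(n):
    dim = 2 ** n                # number of orthogonal quantum states
    t_thermal = n ** 2          # schematic polynomial equilibration time
    t_complexity = dim          # ~exp(n) steps before complexity saturates
    # The recurrence time 2**dim overflows a float for modest n,
    # so report its base-10 logarithm instead.
    log10_recurrence = dim * math.log10(2)
    return dim, t_thermal, t_complexity, log10_recurrence

for n in (10, 50):
    dim, t_eq, t_cx, log_rec = timescales(n)
    print(f"n={n:>2}: states={dim:.3g}  t_thermal~{t_eq}  "
          f"t_complexity~{t_cx:.3g}  t_recurrence~10^{log_rec:.3g}")
```

Even for 50 qubits, the gap between the three scales is vast: equilibration in thousands of steps, complexity saturation after ~10^15, and recurrences only after a time whose exponent is itself ~10^14.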