Physics seems to be telling us that it's possible to simulate the entire universe on a computer smaller than the universe. If we go along with this crazy notion, how powerful would that computer need to be? And how long would the simulation take? Believe it or not, we can figure it out.

Look, I'm not saying the universe is a simulation. I mean, it might be; I'm just not saying it. And perhaps it doesn't make any difference. Even if this is the prime, original physical universe rather than somewhere deep in the simulation nest, we can still think of our universe's underlying mechanics as computation.

Imagine a universe in which the most elementary components are stripped of all properties besides some binary notion of existence or non-existence. It's as if the tiniest chunks of space-time, or of quantum fields, or elements in the abstract space of quantum-mechanical states could either be full or empty. These elements interact with their neighbors by a simple set of rules, leading to oscillations, elementary particles, atoms, and ultimately to all of the emergent laws of physics, physical structure, and the universe itself. I just described the cellular automaton hypothesis. In this picture the universe is a multi-dimensional version of Conway's Game of Life. Such a universe could reasonably be thought of as a computation: cells stripped of all properties until they are indistinguishable from pure information, together forming a sort of computer whose sole task is to compute its own evolution.

This may not be how our reality works, but it's an idea that many physicists take seriously. And even if we aren't emergent patterns in a cellular automaton, we can think of any physical reality as a computation, so long as its underlying mechanics are rule-based evolution over time. That includes most formulations of quantum mechanics and most proposed theories of everything.

We'll come back to the question "Is the universe a computer?" when we look at cellular automata, pancomputationalism, and, more generally, the idea of digital physics and an informational universe. But today let's answer a simpler question: if the universe is a computer, how good a computer is it? And an even more fun question: could you build a computer inside this universe to simulate this universe? In answering this, we'll also answer the recent challenge question, and I encourage you to watch that episode before you watch this one.

The power of a computer can be crudely broken down into two things: how much information it can store, and how quickly it can perform computations on that information. The laws of physics prescribe fundamental limits on both.

The first one, the memory capacity of the universe, is a topic we've looked at before. It's defined by the Bekenstein Bound, which tells us that the maximum information that can be stored in a volume of space is proportional to the surface area of that volume. Specifically, it's the number of tiny Planck areas you can fit over that surface area, divided by 4. It was in studying black holes that Jacob Bekenstein realized that they must contain the maximum possible amount of information, the maximum possible entropy. If you fill a region of the universe with information equal to its Bekenstein Bound, it'll immediately become a black hole. We saw in our episode on the information content of the universe that the maximum information content, the Bekenstein Bound of the observable universe, is around 10 to the power of 120 bits, based on its surface area.
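As a quick back-of-the-envelope check (my own sketch, not the episode's; the Planck length and the radius of the observable universe are assumed standard values):

```python
import math

# Bekenstein Bound of the observable universe, using the rule above:
# (surface area in Planck areas) / 4.
PLANCK_LENGTH = 1.616e-35   # metres (assumed standard value)
R_UNIVERSE = 4.4e26         # metres, radius of the observable universe (assumed)

area = 4 * math.pi * R_UNIVERSE**2          # surface area of the bounding sphere
max_bits = area / (4 * PLANCK_LENGTH**2)    # Planck areas, divided by 4

print(f"~10^{math.log10(max_bits):.0f} bits")   # prints ~10^123
```

This lands at roughly 10^123 rather than exactly 10^120; at this level these are order-of-magnitude estimates, and quoted values shift with the exact radius used and the bits-versus-nats convention.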
At the same time, the actual information content in matter and radiation is probably more like 10 to the power of 90 bits, roughly corresponding to the number of particles of matter and radiation. Bizarrely, the Bekenstein Bound suggests that we could hold all of the information in the observable universe within a storage device smaller than the observable universe. Which brings us to the first part of the challenge question. Assuming you can build a computer that stores information at the Bekenstein Bound, your memory device is essentially the event horizon of a black hole. How large would that black hole need to be to store all of the information about all of the particles in the universe? We'll figure out the case for just matter, and for matter and radiation.
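Here's a sketch of how that calculation might go (my own, using the same Planck-areas-over-4 rule and standard constants, not the episode's worked answer):

```python
import math

# Invert the bound: N bits = 4*pi*r^2 / (4*l_P^2)  =>  r = l_P * sqrt(N / pi)
PLANCK_LENGTH = 1.616e-35   # metres
C = 2.998e8                 # speed of light, m/s
G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
SOLAR_MASS = 1.989e30       # kg

def horizon_radius(bits):
    """Schwarzschild radius of a horizon storing `bits` at the Bekenstein Bound."""
    return PLANCK_LENGTH * math.sqrt(bits / math.pi)

def horizon_mass(r):
    """Mass of a Schwarzschild black hole of radius r: M = r c^2 / (2G)."""
    return r * C**2 / (2 * G)

for label, bits in [("~10^90 bits (matter and radiation)", 1e90),
                    ("~10^120 bits (the full Bekenstein Bound)", 1e120)]:
    r = horizon_radius(bits)
    m = horizon_mass(r)
    print(f"{label}: r ~ {r:.1e} m, M ~ {m / SOLAR_MASS:.1e} solar masses")
```

For 10^90 bits this gives a horizon of order 10^10 metres and a few million solar masses; for the full 10^120 bits, the required horizon is only about two orders of magnitude smaller in radius than the observable universe itself.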
This is such a contrived and backwards argument. First it gives an estimate for the information necessary to describe the Universe that is way smaller than the Bekenstein bound (because of a confusion between a single configuration and the full state space), and then it uses the Bekenstein bound to claim all of it can be crammed into a black hole smaller than the Universe. The resulting statement, that you can simulate the Universe on a computer smaller than it, is wrong and in clear contradiction with the Bekenstein bound.
To simulate a system, you need to be able to encode not just its current configuration but any possible configuration, and that requires an amount of information given by the saturated Bekenstein bound. So if you want to simulate the Universe on a black hole, for example, while minimizing computer size, the black hole has to be as big as the Universe.
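A toy illustration of this distinction (hypothetical numbers, just to make the counting concrete):

```python
import math

# To simulate N two-state cells you must be able to represent any of the
# 2**N possible configurations: log2(2**N) = N bits of state, even if the
# current configuration (say, all cells empty) compresses to almost nothing.
N = 64
state_space = 2 ** N                          # number of possible configurations
bits_for_simulator = math.log2(state_space)   # = N bits
current = [0] * N                             # one highly compressible state

print(f"configurations: 2^{N}; simulator state needed: {bits_for_simulator:.0f} bits")
```

Describing one particular state can take far fewer bits than N, but the machine that evolves the system has to accommodate all 2^N of them.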
P.S. I would also add that counting up the information in radiation and matter is itself a severe underestimate of the information in the current state of the Universe, which is actually dominated, by far, by the entropy of black holes. It really doesn't matter though, because again the information encodable in one state is meaningless; what one needs to process in order to simulate is the information that distinguishes a particular state from all the others, which is S_BH >> 10^90.
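A rough check of that claim (my numbers; the mass is an assumption of about 4 million solar masses, roughly Sagittarius A*):

```python
import math

# Horizon information of a single supermassive black hole, using the same
# Planck-areas-over-4 rule from the video.
PLANCK_LENGTH = 1.616e-35   # metres
C, G = 2.998e8, 6.674e-11   # speed of light, gravitational constant
SOLAR_MASS = 1.989e30       # kg

M = 4e6 * SOLAR_MASS                    # ~Sagittarius A* (assumed mass)
r_s = 2 * G * M / C**2                  # Schwarzschild radius
bits = 4 * math.pi * r_s**2 / (4 * PLANCK_LENGTH**2)

print(f"~10^{math.log10(bits):.0f} bits")   # ~10^90 for this one black hole
```

One Milky-Way-scale black hole already carries on the order of 10^90 bits, so summing over all the black holes in the observable universe pushes the total well past the matter-and-radiation estimate.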
Such a great channel.