Entropy and the Arrow of Time

Video Statistics and Information

Reddit Comments

I really appreciate your videos, but the entire discussion you give here is predicated on equilibrium thermodynamics, and I feel that needs to be pointed out. Nonequilibrium and dissipative systems are often characterized by spontaneous decreases in entropy. The evidence for this is the fact that you're sitting here making these videos, rather than spewing out microwave background radiation! This is a great overview by Jeremy England on the subject. https://www.englandlab.com/uploads/7/8/0/3/7803054/marsland_2018_rep._prog._phys._81_016601.pdf

More subtly though, I think the issue of the arrow of time is not at all done away with by an appeal to the law of large numbers. Many discrete systems are not time reversible. I think the jury is still out on what time even is, let alone why it has a direction. We impose that structure using time-orienting vector fields in general relativity and time ordering operators in quantum field theory. Ultimately the underlying assertion is that the principle of least action holds and we can derive time translation operators in this manner.

πŸ‘οΈŽ︎ 21 πŸ‘€οΈŽ︎ u/kalakau πŸ“…οΈŽ︎ Oct 13 2021 πŸ—«︎ replies

Hi everyone!

I wanted to share with you my latest video, in which I tried to tackle the concept of entropy. In particular, in this video I wanted to address entropy as a "fundamental", probabilistic notion, rather than from the usual physics point of view.

I hope you will enjoy the video. It's definitely not the only possible approach, and it might not be the best, but it's a point of view I am very keen on, and one which is often forgotten in my opinion.

Entropy is a subtle notion, very hard to grasp and explain, so I would be really interested to read your feedback about the video!

Alessandro

πŸ‘οΈŽ︎ 27 πŸ‘€οΈŽ︎ u/AlessandroRoussel πŸ“…οΈŽ︎ Oct 13 2021 πŸ—«︎ replies

Your videos are outstanding. Thanks for taking the time to make them.

πŸ‘οΈŽ︎ 9 πŸ‘€οΈŽ︎ u/MrZaphod-B πŸ“…οΈŽ︎ Oct 13 2021 πŸ—«︎ replies

Just want to say your videos are the best! Keep ’em coming!!

πŸ‘οΈŽ︎ 3 πŸ‘€οΈŽ︎ u/Seaguard5 πŸ“…οΈŽ︎ Oct 13 2021 πŸ—«︎ replies

Nicely presented. Kaon decay also gives us an arrow of time which interestingly agrees with the entropic arrow of time.

πŸ‘οΈŽ︎ 4 πŸ‘€οΈŽ︎ u/womerah πŸ“…οΈŽ︎ Oct 13 2021 πŸ—«︎ replies
Captions
Welcome back to ScienceClic. Today: entropy and the arrow of time.

Consider the following two images. The first represents a precise structure: an apple. The second has no structure; it is homogeneous. Now imagine that we generate a third image by randomly choosing the color of each pixel. Once the image is generated, we can pair it with the previous two. Which of these two images does the new image resemble the most? Visually, the random image looks more like image number two, the homogeneous one: both have no particular structure, unlike the apple. If we keep generating new random images, most of them will resemble image number two. The probability of randomly creating the image of an apple is very low; in general we will rather get a disordered image, without structure. It is this property of looking like something random that we call entropy. Here, image number two has more entropy, because it more often looks like a random image.

In our universe, entropy is defined in the same way. The universe is made up of atoms, which can arrange themselves in many ways, and some of these arrangements seem homogeneous, like a gas, whilst others have structures, like an ice cube or an apple. If we randomly generate a new arrangement by choosing the position and speed of each atom, we are more likely to create a homogeneous arrangement, which will look like a gas, rather than an apple or an ice cube. Therefore we say that a gas has more entropy, because it looks more like a random distribution. More specifically, the entropy of an object is a number which counts, among all possible arrangements, those which resemble our object. The idea is that at our scale, the lack of precision of our measurements and the very large number of atoms prevent us from seeing the differences between these arrangements. Although they are technically different, at our scale they all seem identical. Let's take a look at these two images: when we look at them very closely, we can distinguish the color of each pixel, and it is therefore
possible to differentiate one from the other. But if we look from further away, the precise details of each image are no longer visible, and the two images now look identical.

Going back to physics: the entropy of an ice cube is thus lower than that of a gas, because there are many more configurations that look like the gas than like the ice cube. In a way, entropy measures the degree of freedom that atoms have: it tells us whether our object requires a precise configuration, or whether it can fluctuate through many arrangements while keeping the same appearance. In a low-entropy system, the water molecules do not have much freedom: they only have access to very few precise configurations, and their agitation is very weak, which explains why the ice cube is cold. Conversely, in a high-entropy system such as vapor, the molecules are much freer and can fluctuate through many different arrangements; they are very agitated, hence the high temperature. Intuitively, the notion of entropy therefore allows us to characterize the state of matter: solid, liquid or gas. When we boil water, for example, the energy we provide it with, heat, allows us to increase its entropy, that is to say, to free the molecules which compose it, to give them a more random, disordered behavior, and thus to transition to a phase of higher entropy: vapor.

The notion of entropy is very useful in physics and chemistry, in particular for characterizing states of matter, but as we have seen with the example of images, the notion is much more fundamental and can extend to a multitude of other fields. In mathematics, Shannon's entropy measures the amount of information contained in an object, such as a text. A repeating text contains little information, because it suffices to identify the repeating formula to fully describe it. A more random, less structured text contains a lot of information, because the only possible way to describe it is to indicate each character one by one. In computer science, algorithmic entropy is
quite similar, and measures the complexity for a computer to generate a specific object: the more information the object contains, the less it is possible to compress the code that describes it. The idea of entropy is even used in the study of biodiversity, because it allows us to quantify the range of variations within a set of elements. It is also useful in the study of chaotic systems, to characterize their unpredictable, random behavior, such as the fact that, in the long term, a double pendulum can be affected by very small disturbances. Finally, the study of black holes allows us to assign these entities an entropy, which measures the amount of information they have swallowed. The entropy of a black hole is distributed over its surface, and the more information it captures by absorbing new objects, the larger the black hole becomes.

Entropy also plays a crucial role in our understanding of time, and more particularly of the direction in which transformations happen. To understand, imagine the following two configurations: on the one hand, a balloon is open inside a box, and the balloon contains a gas with high pressure; on the other hand, we imagine the same balloon, open, but this time the gas is evenly distributed inside the box. Assuming that the box evolves spontaneously, can we guess the chronological order in which these two situations occur? Was the gas first compressed in the balloon, and then escaped over time, or was it the other way around: was the gas initially homogeneous, and then compressed into the balloon? Intuitively, our daily experience tells us that the gas has escaped from the balloon, that it would tend to diffuse rather than spontaneously return inside. In general, we have the intuition that physical systems tend to homogenize over time, that tensions, like the pressure in the balloon, tend to relax. If we put a hot object in contact with a cold object, we expect for example that their temperatures would gradually equalize. Yet on a microscopic scale,
the two states do not seem to have a fundamental difference; it is difficult to understand why the phenomenon would necessarily occur in one direction rather than the other. It is at this point that the notion of entropy is crucial. We saw previously that the entropy of an object is greater when the object seems homogeneous. The second state therefore has greater entropy than the first, and it is precisely for this reason that the process occurs in this direction rather than in the other: when we let a system evolve spontaneously, its entropy tends to increase, making the system more and more homogeneous over time. The entropy of an isolated system always tends to grow, but behind this very powerful principle actually hides a simple logical explanation.

To understand it, let's go back to our analogy from the beginning: a structured image that represents an apple. At every instant, we imagine making the color of each pixel of the image fluctuate slightly, and we observe its evolution over time. At the microscopic scale, the image fluctuates and gradually drifts away from its initial configuration. As time passes, it will go through multiple different arrangements, passing randomly from one to the other, but at our scale, the image of the apple seems to gradually fade: it becomes more and more homogeneous, until perfectly resembling the grey image we had seen at the beginning. By fluctuating at random, the image itself has become more and more random, and as we saw at the beginning of the video, an image whose pixels are random has a very high probability of being homogeneous. In our universe, atoms are agitated, and their microscopic distribution constantly fluctuates, passing spontaneously from one configuration to another. If we wait long enough, the structure of the universe tends to become more and more random on a microscopic scale, and a random structure most often tends to be homogeneous. As time goes by, the structures in our universe have the tendency to blur, while its total
entropy continues to increase: the entropy of the universe measures the progression of its homogenization over time.

To conclude: entropy is a fascinating concept, although a very concrete one, allowing us to characterize the states of matter and their transformations. It ultimately boils down to very simple ideas. In particular, entropy explains the arrow of time, the direction in which the universe evolves, through simple probabilistic logic: atoms can be arranged in many different ways, but from our point of view, some of these configurations look the same, and in particular those which are the most homogeneous. As there are more of them, by fluctuating little by little, the universe naturally tends to adopt them in the long term. Today, the question arises as to whether the universe will end up being completely homogeneous. One could imagine that, by constantly increasing, its entropy would eventually reach a maximum, that the universe would reach a point of equilibrium, and that the arrow of time would disappear; only microscopic fluctuations would remain. However, the universe is expanding, and the phenomena of general relativity, linked to its geometry, can perhaps prevent this scenario of heat death.
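Two of the transcript's claims can be illustrated in a few lines of Python: that entropy counts the microscopic arrangements behind what we see at our scale (so homogeneous-looking states overwhelmingly outnumber structured ones), and that a repeating text carries less Shannon information, and compresses better, than a random one. This is a minimal sketch for illustration; the two-halves box model, the sample texts, and the function names are my own, not from the video.

```python
import math
import random
import zlib
from collections import Counter

# --- 1. Counting arrangements (the video's definition of entropy) ---
# Toy model: N gas particles, each sitting in the left or right half of a box.
# The "macrostate" we see at our scale is just how many particles are on the
# left; the number of microscopic arrangements behind it is a binomial count.
N = 100
structured = math.comb(N, 0)        # all particles on one side: exactly 1 way
homogeneous = math.comb(N, N // 2)  # evenly spread: about 1e29 ways

print(homogeneous / structured)     # the homogeneous look wins overwhelmingly

# Boltzmann-style entropy S = log(number of arrangements), natural units
print(math.log(structured), math.log(homogeneous))

# --- 2. Shannon entropy of a text, in bits per character ---
def shannon_entropy(text: str) -> float:
    """Estimate entropy from the character frequencies of the text."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

repetitive = "ab" * 500  # fully described by a short repeating formula
random.seed(0)
disordered = "".join(random.choice("abcdefghijklmnopqrstuvwxyz")
                     for _ in range(1000))

print(shannon_entropy(repetitive))  # 1.0 bit/char (two symbols, evenly used)
print(shannon_entropy(disordered))  # close to log2(26) ≈ 4.7 bits/char

# Algorithmic-entropy intuition: the disordered text compresses far worse.
print(len(zlib.compress(repetitive.encode())),
      len(zlib.compress(disordered.encode())))
```

The ratio in part 1 is the whole arrow-of-time argument in miniature: a randomly chosen arrangement of 100 particles is astronomically more likely to look balanced than to look sorted, which is why spontaneous fluctuations drive systems toward the homogeneous macrostate.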
Info
Channel: ScienceClic English
Views: 176,828
Id: NfTmy1ApCvI
Length: 12min 37sec (757 seconds)
Published: Tue Oct 12 2021