Concepts of Thermodynamics
Prof. Suman Chakraborty
Department of Mechanical Engineering
Indian Institute of Technology, Kharagpur
Lecture - 42
What is Entropy?
In the previous lecture, we were discussing the concept of entropy generation and its physical understanding. With this background, we will now learn a very important principle: the principle of increase of entropy.
Let us say that you have a thermal reservoir which maintains its temperature T_0, and there is a heat transfer δQ from it to a system; the system temperature changes based on the heat transfer. So, the temperature at the system boundary is T, which is different from T_0.
Then, taking this as your system, you can write dS_system ≥ δQ/T, with equality for a reversible process and inequality for an irreversible one. What is dS for the surroundings? The surroundings here are the thermal reservoir. When you write dS for the surroundings, you have to imagine the surroundings themselves as a system. For the surroundings as a system, the heat transfer is −δQ, its temperature remains constant, and because it is a vast thermal reservoir, the process can be considered reversible for the surroundings; so dS_surroundings = −δQ/T_0. Then dS_system + dS_surroundings ≥ δQ(1/T − 1/T_0). Can you tell, for the heat transfer to be feasible, which one is larger: T or T_0? Definitely T_0 is larger. So T_0 > T, which means 1/T > 1/T_0, and δQ is positive. Therefore dS_system + dS_surroundings ≥ 0.
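Collecting the steps of this argument in one place, in the notation used above:

```latex
dS_{\mathrm{system}} \geq \frac{\delta Q}{T}, \qquad
dS_{\mathrm{surroundings}} = -\frac{\delta Q}{T_0}
\quad\Longrightarrow\quad
dS_{\mathrm{system}} + dS_{\mathrm{surroundings}}
\geq \delta Q \left( \frac{1}{T} - \frac{1}{T_0} \right) \geq 0
```

The last inequality holds because T_0 > T makes the bracket positive, while δQ > 0 for heat flowing into the system.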
System plus surroundings together is called the universe. So, dS_universe ≥ 0; this is called the principle of increase of entropy. With every process taking place in the universe, the entropy is continuously increasing. But with every statement we make, we have to be careful. I have seen students interpreting this as "entropy always increases during a process". No: it is the entropy of the universe that always increases; the entropy of the system may increase or decrease. Look at the expression
S_2 − S_1 = ∫ δQ/T + S_gen.
This is S_2 − S_1 of the system. So, when can the entropy of the system decrease? S_gen is always non-negative, no doubt about it; it can be zero at the minimum. The integral of δQ/T can be positive or negative, depending on whether δQ is positive or negative. So, the sum of these two terms can really be anything: positive, negative, or zero. If the sum is positive, the entropy of the system increases; if the sum is negative, the entropy of the system decreases. And if the sum is zero, there are two possibilities. One possibility is that both terms are individually zero; that is a reversible adiabatic process. So, one way to have the change in entropy equal to zero is a reversible adiabatic process. But there is another possibility. Say ∫ δQ/T is −10 and S_gen is +10 (in some units); then the sum is also zero, but that is not reversible adiabatic. So, the change in entropy during a process being zero does not necessarily mean that the process is reversible adiabatic; you have to be very clear about that.
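To make these sign combinations concrete, here is a minimal Python sketch with hypothetical numbers (units arbitrary), including the −10/+10 case just mentioned:

```python
def delta_S_system(integral_dQ_over_T, S_gen):
    """S2 - S1 = (integral of dQ/T) + S_gen, where S_gen >= 0 always."""
    assert S_gen >= 0, "entropy generation can never be negative"
    return integral_dQ_over_T + S_gen

print(delta_S_system(5.0, 2.0))     # +7.0: entropy of the system increases
print(delta_S_system(-5.0, 2.0))    # -3.0: entropy of the system decreases
print(delta_S_system(0.0, 0.0))     #  0.0: reversible (S_gen = 0) and adiabatic (dQ = 0)
print(delta_S_system(-10.0, 10.0))  #  0.0: yet NOT reversible adiabatic
```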
So, here it is the change in entropy of the system plus that of the surroundings that is greater than or equal to zero. Entropy continuously increases, that is true, but it is not the entropy of the system just by itself; it is the entropy of the system plus the surroundings, that is, the entropy of the universe. So, this again is a very important concept.
A question may arise here: dS_surroundings = −δQ/T_0; why is this written with an equality rather than an inequality? The surroundings that we are talking about undergo a reversible isothermal process. This is an example where we have a fixed T_0 but a variable T. The reason we write the equality is that we normally consider the surroundings to be a thermal reservoir; that is the common thermodynamic situation. And for any thermodynamic process you will always have a thermal reservoir somewhere; it may not be the immediate surroundings, but the ambient, which lies outside the immediate surroundings. That thermal reservoir is correctly represented by the equality rather than the inequality, and that is why the equality is used here. For a thermal reservoir, you can indefinitely add or subtract heat without much change in its state; it is a very slow heat transfer at constant temperature. So, for the surroundings, it is like a reversible isothermal process.
Now, we have discussed so much about entropy, and the funny thing is that even after all this, we have not really understood what entropy is. This is a favorite question for catching candidates out. I have seen it in many interviews. The candidate appears in the interview, and the examiner, with a very nice smile, asks: what is your favorite subject? This poor candidate has perhaps done very well in his or her thermodynamics courses, so he or she quickly says that thermodynamics is the favorite, and then the trap is set. The next question is: have you heard of entropy? Of course, anybody who has studied thermodynamics will at least know that there is a term called entropy. The next question is the obvious extension: can you explain what entropy is? It is always easy to ask these questions and very difficult to answer them. The privilege of examiners is that they sit on the other side of the table. They can ask anything, and they expect that, at an age when people are not that mature, the candidate will give a very mature, generalized answer to the question of what entropy is, and
it is very, very difficult. As I told you at the very beginning, understanding the second law and entropy is like visualizing an elephant from different angles: whether you are looking at the ear or the tail or something else, you give your own interpretation based on that. So, I would like to give you some interpretations of entropy in some chosen fields, but you have to understand clearly that these do not cover everything. And just to divert a little from heat transfer and work done, which we have discussed at length, I would like to give an example from information theory. Entropy is a very important concept used
in information theory. I am not a specialist in information theory, but I can give you certain analogies which have an equivalent presence in classical thermodynamics. In information theory, there is a quantity that tells you how much information is contained in a given piece of data. Why do you need it?
Let me give you an example. Many people have this habit: after waking up in the morning, they look at the smartphone or computer and browse the latest news from their favorite newspaper. The newspaper webpage carries certain news items. Let us say some hypothetical newspaper runs a one-sentence item: "Sun rises in the east today". Now, this is a fact; you cannot say it is wrong, but it will not create any sensation. The information content in it is very little; no information at all, I should say. On the other hand, some news media, just to create a sensation (and sometimes this kind of thing is done purposefully), bring into the highlights news that describes a very unexpected event. Say somebody writes "Sun rises in the west today"; hypothetical and impossible, of course, but I am just giving you an example.
I have seen, while travelling by railways, statements written on walls claiming that the sun moves around the earth, and they catch the attention of many people, because they contradict what people learned in their education. This creates an immediate sensation. People might later realize it is not correct, but who will verify? Before any verification, the number of clicks and so-called likes on social media pages will blow up like anything. And this is what people want; these days, people are more concerned about how many likes they get on social media than about the seriousness of what they intend to convey.
So, to bring out the bottom line here: the amount of information in an event is proportional to log(1/p), where p is the probability of occurrence of the event. If you have an event with very low probability, 1/p is a large number, its logarithm is large, and therefore the information content is high. Based on this, information theory defines an entropy of information, which is basically the weighted average of the information content over all possible events:
H = Σ_i p_i log(1/p_i).
This is called the entropy of information. A higher entropy of information essentially means that the data is information rich. So much, then, for information theory.
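As an illustration (a minimal sketch, not part of the lecture; the probabilities are made up), the two headlines above and the entropy of information can be put into a few lines of Python, using base-2 logarithms so that information is measured in bits:

```python
import math

def information_content(p):
    """Information of an event with probability p: log2(1/p), in bits."""
    return math.log2(1.0 / p)

def information_entropy(probs):
    """Entropy of information: weighted average of information content,
    H = sum_i p_i * log2(1/p_i)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# "Sun rises in the east": virtually certain, so almost no information.
print(information_content(0.9999))  # ~0.0001 bits
# A highly improbable, sensational headline carries much more information.
print(information_content(0.0001))  # ~13.3 bits

# A fair coin toss has the maximum entropy for two outcomes; a nearly
# certain source has almost none.
print(information_entropy([0.5, 0.5]))        # 1.0 bit
print(information_entropy([0.9999, 0.0001]))  # ~0.0015 bits
```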
Now, let us consider an equivalent situation in classical thermodynamics. We have a system with a total energy, volume, temperature, and so on; this is a macroscopic state. When we say the system has a particular temperature, a particular energy, a particular volume, this is the macroscopic picture. Now, there could be many possible microscopic states that give rise to the same macroscopic picture. So, let there be N possible microstates for a given macrostate. If you give the system more freedom, there will be more such microstates; if you give it less freedom, if it is more constrained, there will be fewer microstates.
Now, what is the probability of occurrence of each of the microstates? This can be linked with the information picture by taking the probability of each microstate as 1/N: there are N possible microstates, each of them equally probable, with no bias between them. So, the probability of each microstate is P_i = 1/N. Then you can take the same weighted average and write a similar definition of entropy, now proportional to Σ_i P_i ln(1/P_i). Each 1/P_i becomes N, each P_i is 1/N, and the sum runs over the N possible states, so the total is N × (1/N) × ln N = ln N. So, the entropy is proportional to ln N. This is Boltzmann's concept of entropy from a microscopic angle.
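A minimal numerical check of this collapse to ln N (the value of N is arbitrary; this sketch is not from the lecture):

```python
import math

N = 10**6      # assumed number of equally probable microstates
p = 1.0 / N    # probability of each microstate

# Weighted average of information content with the natural logarithm:
# sum over N identical terms of p * ln(1/p) = N * (1/N) * ln(N) = ln(N)
S_proportional = sum(p * math.log(1.0 / p) for _ in range(N))
print(S_proportional)  # ~13.8155
print(math.log(N))     # ln(10^6) = 13.8155..., the same value
```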
Now, the problem is that when we work with classical thermodynamics, what does entropy relate to? Entropy relates to heat transfer divided by temperature; that is its unit, and that unit does not match a bare logarithm. To match the units, Boltzmann proposed that the proportionality be absorbed into a constant, the Boltzmann constant. Roughly, what is it? It is the universal gas constant divided by the Avogadro number, about 1.38 × 10⁻²³ joule per kelvin. Then this entropy multiplied by temperature has the unit of energy, the joule. So, the Boltzmann constant was introduced just to bridge this probability-based definition with the classical definition of entropy through work and heat, and S = k_B ln N is called Boltzmann's entropy. This is how entropy is interpreted in microscopic thermodynamics, but it all depends on the specific level of interpretation that you give.
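A quick check of the numbers quoted above, using rounded standard constants:

```python
import math

R = 8.314        # universal gas constant, J/(mol K)
N_A = 6.022e23   # Avogadro number, 1/mol

k_B = R / N_A    # Boltzmann constant
print(k_B)       # ~1.38e-23 J/K, as quoted in the lecture

# Boltzmann entropy for the N = 10^6 microstates used earlier:
print(k_B * math.log(10**6))  # S = k_B ln N, now in J/K
```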
Now, based on this, what can we say? We can say that if a system is highly ordered, then the number of possible microstates is small, because a very ordered system is very constrained; it will not be of the chaotic type with a large number of possible microstates. For example, think of a regiment of soldiers: they are very ordered. Everybody in the regiment does the drill in exactly the same way, and that is perfect order. In the military, when training is taking place, you will find they are so regimented that they follow exact order. For exact disorder, imagine a junior school
where there is a lunch break or tiffin break after class. All the young kids come out of the class, literally jumping out, and run around using whatever space there is on the school lawn; that is absolute disorder. Why is that absolute disorder? Because there is a large number of possible microscopic states: every student, every child can act in his or her own way. But eventually there is a macroscopic state prevailing in the school, the state defined by the tiffin hour. Within it there is chaos and a large number of possible microscopic states, so it is a highly disordered system.
Now imagine a thermodynamic system with a partition: gas on one side, vacuum on the other. You allow the gas to expand freely, and that is perfect disorder, because the gas will spread randomly across the whole volume. But if you constrain it to move in a quasi-equilibrium process, then it is more regimented, like the soldiers. So, we can understand that the greater the disorder, the greater the number of possible microstates; and the greater the number of possible microstates, the greater the entropy.
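For an ideal gas, the entropy rise in such a free expansion can be computed from the standard relation ΔS = n R ln(V_2/V_1); this formula is not derived in the lecture, and the numbers below are assumed for illustration:

```python
import math

n = 1.0            # amount of gas, mol (assumed)
R = 8.314          # universal gas constant, J/(mol K)
V1, V2 = 1.0, 2.0  # removing the partition doubles the volume (assumed)

# Free expansion: no heat, no work, temperature of an ideal gas unchanged,
# yet entropy rises because the gas can occupy many more microstates.
delta_S = n * R * math.log(V2 / V1)
print(delta_S)  # ~5.76 J/K, positive: disorder, and entropy, increase
```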
So, to summarize, we can say that entropy is a measure of disorder; and because the change in entropy of the system and surroundings together, that is, of the universe, is positive, we can say that disorder in the universe is increasing in some way or the other. Why is disorder increasing? Because it is more natural to have disorder than order; if you leave a system free, it will naturally choose disorder. One day I organized my office very nicely: all the files on one side, all the books on one side, all the pens in one place, the board eraser in one place, the marker pens in another, all the theses that I have examined in one place. It looked very nice, and I gave a lot of effort to do that. After 7 days, I find that, with whole lots of files and papers submitted by my students for correction, and this and that, the entire office is in disorder. I have given no special effort, but the system has actually preferred to go to a disordered state spontaneously.
So, having disorder is more spontaneous than having order. And this leads to one of the very important definitions, the definition of absolute entropy. The definition of absolute entropy is governed by the third law of thermodynamics. I will mention it only briefly, because in this particular course we do not have a provision for discussing the third law in detail. What it says is that the entropy of a perfectly
crystalline substance is zero at absolute zero. This is very important, because the second law as such does not define entropy; it only quantifies the change of entropy. The third law defines absolute entropy, and it is based on the physical understanding that entropy is a measure of disorder. At absolute zero, you expect the disorder to be minimum, because the thermal energy is not there; but even then there could be disorder because of imperfections within the system. If it is a perfectly crystalline substance, then it is absolute order. A perfectly crystalline substance at absolute zero is, of course, hypothetical, but that hypothetical state is considered a state of absolute order and therefore of zero entropy; that is what the third law says.
So, finally, I would like to give you one more example, beyond heat transfer, work done, and even information theory. This is the example of stretching a spring, a very simple example that relates mechanics with thermodynamics.
So, you have a spring, and by a force F you stretch it. The question is: does its entropy increase or decrease? Let us use the T dS relation. Can we use T dS = dU + p dV here? No, because p dV is not the relevant work here. What is the relevant work? F dx. Is it positive or negative? Here, work is done on the system to stretch it; work done by the system is positive and work done on the system is negative, so the work term is −F dx, and the relation becomes T dS = dU − F dx. If the stretching takes place isothermally, and the internal energy depends on temperature alone, then dU = 0, and you are left with (∂S/∂x)_T = −F/T. F is positive, and T, the absolute temperature, is positive; that means as x increases, the entropy decreases. So, as a spring is stretched more and more, its entropy decreases. Why? Because when the spring is free, it can attain a large number of microscopic states, but when you stretch it, you make it more and more constrained. That means you allow fewer microscopic states for a given macroscopic state, and that makes the entropy decrease.
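As a worked instance of this result, assume a linear force law F = kx (the lecture does not specify one); integrating dS = −(F/T) dx at constant T gives S(x) − S(0) = −k x²/(2T). A small sketch with assumed numbers:

```python
k = 100.0   # spring constant, N/m (assumed)
T = 300.0   # absolute temperature, K (assumed)
x = 0.05    # isothermal stretch, m (assumed)

# From T dS = dU - F dx with dU = 0 and F = k*x:
# S(x) - S(0) = -k * x**2 / (2 * T)
delta_S = -k * x**2 / (2.0 * T)
print(delta_S)  # ~ -4.2e-4 J/K: negative, entropy falls as the spring stretches
```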
So, I hope that I have given you some examples which illustrate the concept of what entropy is. Although it is very difficult to answer, as I told you, through examples and by relating various fields we can perhaps attempt to develop a unified understanding of entropy and how it relates to practical processes. Thank you very much.