A New Thermodynamics

www.newthermodynamics.com

By Kent W. Mayhew

 

Entropy: A Parameter Lacking Clarity

 

   Entropy is the thermal parameter which, when multiplied by temperature, gives energy. Clausius's mid-19th-century logic sounds so simple even today. Yet of all the thermodynamic parameters, entropy remains the one that lacks any real clarity.

  

By the middle of the 20th century, entropy was accepted as being the "randomness of matter in incessant motion" (1). Again, it sounds so simple; upon hearing this in high school science, I simply accepted it, without pondering what relationship randomness has to energy anyhow. It was blind acceptance resulting in a strong foundation, making me ready for my university indoctrination.

 

Consider a barrel of hot water. Certainly we can envision that this barrel required, and then stored, energy as it was heated. But the heating of the water resulted in no real observable change in randomness at a macroscopic level. Even at a microscopic level, although the water molecules are vibrating with more energy than they were before being heated, one might ask: "Are the molecules really any more random?" We can certainly understand the argument that the molecular vibrational energies have increased, but then to say that this equates to randomness just seems so subjective.

 

Interestingly, Arieh Ben-Naim (2) points out in his book that the interpretation of the randomness of gaseous molecules really lies in the eye of the beholder. Arieh gives several examples, and he is right in saying that "randomness" is a term that lacks scientific foundation.

 

Consider Clausius's 19th-century choice of the term "entropy" (2): "I prefer going to the ancient languages for the names of important scientific quantities, so that they mean the same in all living tongues. I propose, accordingly, to call S the entropy of a body, after the Greek word "transformation". I have designedly coined the word entropy to be similar to energy, for these two quantities are so analogous in their physical significance, that an analogy of denominations seems to be helpful."

  

In his book's preface, Arieh Ben-Naim (2) quotes Leon Cooper (1968), who, right after citing Clausius's explanation of his reasons for choosing the word "entropy", commented: "By doing this, rather than extracting a name from the body of current language (say: lost heat), he succeeded in coining a word that meant the same thing to everybody: nothing."

   

Ben-Naim (2) and I both agree with Cooper's assessment of entropy. Ben-Naim further rightfully states that energy and entropy are not analogous, and that "lost work" might not be the best choice either. In the context of my book, one can plainly see that a reason for entropy's initial conceptualization was to explain the work lost by expanding systems, e.g. the Carnot engine. Of course, my simple explanation in terms of the displacement of Earth's atmosphere was not envisioned by anyone before, including Ben-Naim. Rather, Ben-Naim's choice was to tweak the sciences in a bizarre yet interesting marriage of Shannon's information theory with traditional thermodynamics.

  

A more recent, 21st-century definition is that entropy is "the dispersal of a system's molecular energy" (3). This is Frank Lambert's beloved statement. It can certainly be argued that Frank's interpretation is an improvement; even so, I am sorry to say that, to me, Frank's definition of entropy also lacks real clarity.

   

  Certainly, I do prefer the term dispersal over randomness, because it invokes the visualization that molecules and/or energy will tend to disperse. Their dispersion is limited by the constraints imposed upon a system, so long as the system is given sufficient time to attain equilibrium. Constraints can take many forms. Walls tend to contain gaseous molecules, preventing them from simply traveling outwards until their next collision with another gaseous molecule. Gravity is the mother of all constraints, as it holds galaxies and all their various components together. And of course pressure can be considered a force constraint upon a system, though often it is really just a subset of gravity acting as a force of constraint.

  

Note: I apologize for the use of plain-text symbols for parameters below. In the book I use a math editor, but here in HTML I do the best I can. In any case, what matters is the points being made, not the mathematical symbols.

   

  We can ask: Mathematically, what is entropy?

   

    The simple relation conundrum?

   

  As previously stated, entropy lacks clarity. This is not for lack of effort by Frank Lambert, and countless others, to give it real meaning. The reason entropy lacks clarity is, in part, the numerous applications in which it is blindly applied. Consider the enthalpy (H) relation:

   

              H=E+PV                                                     1a)

 

This could also be written as:

   

                TS=H=E+PV                                       1b)

  

Certainly, based upon 1b), we can see that entropy (S) is something that, when multiplied by temperature (T), gives something whose units are energy (SI units: joules). Specifically, TS equates to the internal energy (E) plus pressure (P) multiplied by volume (V).

 

Note: Eqn 1) is most often written as H=E+PV, thus eliminating TS, which allows one to ignore its association with entropy. However, the association with entropy is evident in the thermodynamic relation for isothermal entropy change, that being:

 

 TdS=dE+PdV                                               2)

 

Integrating both sides lends itself to TS=E+PV, hence allowing us to write 1b). This sort of constructive logic does make one ponder whether enthalpy is designed to hide the direct association with entropy.

   

  For an N-molecule ideal gas we can write the ideal gas law:

   

               PV=NkT                                                     3)

   

 Again, it all seems so simple; so why does entropy lack clarity? Well, ask what 1) represents. Be it right or wrong, in high school I envisioned 1) as equating to energy and happily went to bed at night, pondering my teenage life. Then in university I learned kinetic theory, and that the total energy (Et) of a monatomic ideal gas was:

   

           Et= 3NkT/2                                                  4)
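
To make 3) and 4) concrete, here is a minimal Python sketch (my own illustration, using standard SI constants, not a calculation from the book) evaluating both for one mole of a monatomic ideal gas at 300 K:

```python
# Evaluate eqns 3) and 4) for one mole of a monatomic ideal gas at 300 K.
k = 1.380649e-23   # Boltzmann's constant (J/K)
N = 6.02214076e23  # number of molecules (one mole)
T = 300.0          # temperature (K)

PV = N * k * T        # eqn 3): PV = NkT
Et = 1.5 * N * k * T  # eqn 4): Et = 3NkT/2

print(f"PV      = {PV:.1f} J")       # ~2494 J
print(f"Et      = {Et:.1f} J")       # ~3741 J
print(f"Et + PV = {Et + PV:.1f} J")  # enthalpy per eqn 1a): ~6236 J
```

Note that Et and PV differ, a mismatch that seeds the conundrum developed below.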

 

I was also taught that the heat capacity (Cy) times the temperature change (dT) gives a system's energy change (dQ):

   

               CydT =  dQ                                                 5)
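
As a numeric aside (my own illustrative numbers, not the book's), eqn 5) is how one would compute the energy needed to heat the barrel of water from earlier:

```python
# Energy to heat a 200 L barrel of water by 50 K via eqn 5): dQ = Cy*dT,
# where Cy = mass * specific heat. The masses and temperatures are assumptions.
mass = 200.0      # kg of water (a 200 L barrel)
c_water = 4186.0  # specific heat of water (J/(kg*K))
dT = 50.0         # temperature rise (K)

Cy = mass * c_water  # heat capacity of the barrel's water (J/K)
dQ = Cy * dT         # eqn 5)
print(f"dQ = {dQ/1e6:.1f} MJ")  # ~41.9 MJ
```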

 

And finally that 5) can also be written in terms of entropy (S):

   

               SdT=dQ                                                      6)

 

  Comparing 6) to 5), logic would dictate that:

   

                 S=Cy                                                          7) 

 

7) implies that entropy is nothing more than a form of heat capacity for any system. Heaven forbid such simple constructive logic be utilized. If we embrace 7), then in going from absolute zero to the gas's current temperature, dT=T-0=T, and we might then boldly write for that ideal gas:

   

                  ST=CyT = 3NkT/2                                         8)

  

Although we cannot fully embrace 7), hence 8), now compare them to 3). Similar constructive logic might give someone cause to write:

   

                     ST=3PV/2                                                  9)

   

 Surely 7), 8) and 9) all embrace constructive logic, but if they were applied to thermodynamics then we would end up with the following conundrum: why does 9) not equate to 3) for an ideal gas? Moreover, combining 9) with 1b), constructive logic might lend itself to writing that the internal energy (E) equals:

   

                     E = PV/2                                                      10)

   

 Certainly not! 10) is NOT acceptable at any level: by 3) and 4), a monatomic ideal gas has E=3PV/2, not PV/2.
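
For concreteness, here is a short Python sketch (again my own illustration) that follows the constructive-logic chain 7) through 10) and compares the result against kinetic theory; the factor-of-three mismatch is the conundrum:

```python
# Follow the constructive-logic chain 7)-10) for one mole of a monatomic ideal gas.
k = 1.380649e-23   # Boltzmann's constant (J/K)
N = 6.02214076e23  # one mole of molecules
T = 300.0          # K

PV = N * k * T         # eqn 3)
Cy = 1.5 * N * k       # heat capacity of a monatomic ideal gas (J/K)
ST = Cy * T            # eqn 8): ST = CyT = 3NkT/2, i.e. eqn 9): 3PV/2
E_construct = ST - PV  # eqn 10) via 1b): E = ST - PV = PV/2
E_kinetic = 1.5 * PV   # eqn 4): E = 3NkT/2 = 3PV/2

print(f"ST            = {ST:.1f} J")
print(f"E per eqn 10) = {E_construct:.1f} J")  # PV/2
print(f"E per eqn 4)  = {E_kinetic:.1f} J")    # 3PV/2: three times larger
```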

 

Now ask: what is entropy (S)? Well, based on 7) it is a form of heat capacity, but it is also somehow related to enthalpy, which does not define the true energy of a system of monatomic gas. This lends itself to asking whether entropy is:

 

1) Something that when multiplied by temperature gives a system’s enthalpy

 and/or

2) Something that when multiplied by temperature gives a system's energy.

 

 BUT a system's enthalpy is NOT necessarily equal to a system's energy. Consider a monatomic ideal gas: by 1a), 3) and 4), its enthalpy is 5NkT/2, but its energy is 3NkT/2. So entropy can be one or the other, but it cannot remain as both! Does this not imply that, from a simple mathematical perspective, entropy lacks clarity? Certainly it does! And in many ways this also explains why entropy has no clear, concise literal definition.

  

Okay, we can avoid the above constructive logic by limiting entropy to being something that, when multiplied by a temperature change, gives a system's energy change, and then ignoring entropy's relation to enthalpy. But doing so means that we are just fooling ourselves, are we not? And it also means that we must reconsider what the enthalpy relation really is.

 

Even so, we have just scratched the surface of why entropy is such a confused thermodynamic parameter. Let us now investigate its somewhat more complex mathematical identity.

 

Entropy’s more complex conundrum.

 

Certainly, we can bury entropy further by saying that it is defined in terms of the number of microstates (@). That is, we now have a third (or is it a fourth?) mathematical definition for entropy, that being:

 

                      S=k[ln@]                                                     11)

 

Eqn 11) is fundamental to statistical thermodynamics. If energy is added to a system, then the number of accessible energy states can increase. This embraces constructive logic.
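
Eqn 11) can be made tangible with a toy model. The sketch below is my own illustration (an Einstein solid of N oscillators sharing q energy quanta, a standard textbook model, not one used in the book): it counts the accessible states and shows that adding energy increases their number, and hence S:

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann's constant (J/K)

def omega(N, q):
    """Microstates of an Einstein solid: N oscillators sharing q quanta."""
    return comb(q + N - 1, q)

N = 50                      # oscillators
for q in (10, 20, 40, 80):  # increasing total energy, in quanta
    W = omega(N, q)
    S = k * log(W)          # eqn 11): S = k ln(number of accessible states)
    print(f"q = {q:3d}   states = {W:.3e}   S = {S:.3e} J/K")
# More energy -> more accessible energy states -> larger S.
```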

 

Ideal Monatomic Gas: Isobaric Isothermal Expansion

 

Consider that an ideal monatomic gaseous system experiences an input of thermal energy, and that this results in the isothermal isobaric expansion of the system. Accept that such an expansion must perform work onto the surrounding atmosphere, and that such work is energy lost by the expanding system, i.e. lost work

(please see the blog/discussion on lost work, if you have not). In a nutshell: the atmosphere has mass, an expanding system must upwardly displace this mass, and the work lost by the expanding system equals the potential-energy increase of the surrounding atmosphere, that being: W=PdV.
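
A quick numeric sketch of the lost-work claim (my own sample numbers): the work done against the atmosphere when a system expands by one litre at atmospheric pressure:

```python
# Work lost in upwardly displacing the atmosphere: W = P*dV.
P_atm = 101325.0  # atmospheric pressure (Pa)
dV = 1.0e-3       # expansion volume (m^3), i.e. one litre
W = P_atm * dV    # potential-energy increase of the surrounding atmosphere
print(f"W = {W:.1f} J")  # ~101.3 J lost by the expanding system
```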

 

Herein the system's temperature does not increase, because the system's energy input equates to its energy output, that being the work required to upwardly displace the atmosphere's mass:

   

       Energy input = Wout = PdV             12)

 

In such an idealistic case, since the energy within the system remains constant, the system's temperature remains constant. Seems simple enough, but now ask what happens to eqn 11). If there is no energy change within the system, then does the number of accessible energy states increase?

If you accept the traditional assertion that the number of accessible energy states is a function of volume, then you will wrongly answer yes.

 

However, logic dictates that if the total energy of a system remains constant, then the total number of accessible energy states should remain constant. And this is the correct answer, at least if you accept this author's new perspective. This disavows any correlation between volume and the number of accessible energy states, which is to say that writing entropy change in terms of volume change is simply a mistake.

 For example, chemists and others often write the isothermal entropy change of an ideal gas in terms of volume change; letting "i" and "f" signify the initial and final states:

 

                            dS=Nkln(Vf/Vi)                                13)
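
For concreteness, the textbook calculation based on 13), which this author disputes, runs as follows (a minimal sketch with my own sample numbers):

```python
from math import log

# Textbook isothermal entropy change of an ideal gas, eqn 13): dS = Nk ln(Vf/Vi).
k = 1.380649e-23   # Boltzmann's constant (J/K)
N = 6.02214076e23  # one mole of molecules
Vi, Vf = 1.0, 2.0  # initial and final volumes (only their ratio matters)

dS = N * k * log(Vf / Vi)
print(f"dS = {dS:.2f} J/K")  # Nk ln 2, ~5.76 J/K for a volume doubling
```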

 

Part of the reason eqn 13) is wrongly embraced is that scientists realized that energy was lost by expanding systems, and then wrongly decided that there is a correlation between a system's volume and its energy. This is all based upon their misguided understanding: not realizing that the reason energy is lost is that the expanding system must do work onto the surrounding atmosphere. In other words, eqn 13) considers the work done in terms of the system's internal volume change rather than the external displacement of the surrounding atmosphere.

 

The above misconception is reinforced by mathematical analysis. Specifically, eqn 13) is based upon the Sackur-Tetrode equation combined with the premise that the number of particles and the internal energy of an expanding ideal gas remain constant, i.e. Q=W=PatmdV. Note that the Sackur-Tetrode equation is traditionally accepted as defining the entropy of an ideal gas.
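
Since the Sackur-Tetrode equation is invoked here, a runnable sketch may help (my own implementation of the standard formula, evaluated for argon as an assumed example). Note how doubling V reproduces eqn 13), which is precisely the circularity discussed below:

```python
from math import pi, log

# Sackur-Tetrode entropy of an ideal monatomic gas:
# S = Nk [ ln( (V/N) * (2*pi*m*k*T/h^2)^(3/2) ) + 5/2 ]
k = 1.380649e-23    # Boltzmann's constant (J/K)
h = 6.62607015e-34  # Planck's constant (J*s)

def sackur_tetrode(N, V, T, m):
    lam = (2.0 * pi * m * k * T / h**2) ** 1.5  # (1/thermal wavelength)^3
    return N * k * (log((V / N) * lam) + 2.5)

N = 6.02214076e23         # one mole of argon atoms
m = 39.948 * 1.66054e-27  # mass of one argon atom (kg)
T = 298.15                # K
V = 0.024466              # m^3: one mole at 1 atm and 298.15 K

S1 = sackur_tetrode(N, V, T, m)
S2 = sackur_tetrode(N, 2 * V, T, m)
print(f"S(V)       = {S1:.1f} J/K")       # ~154.8 J/K, the accepted value for argon
print(f"S(2V)-S(V) = {S2 - S1:.2f} J/K")  # = Nk ln 2, i.e. eqn 13)
```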

 

Certainly, you will arrive at the same empirical answer thinking in terms of the expanding system's volume increase, i.e. randomness, as you will thinking in terms of the work lost into the surrounding atmosphere. But your science will eventually become a complication of the simple if you think in terms of an isolated system experiencing an increase in randomness, rather than a non-isolated system expanding and displacing its surroundings. Hopefully you, the reader, are now starting to understand the fundamental difference.

 

So we can now say that entropy may be defined in terms of the number of accessible states, i.e. by eqn 11). However, the number of accessible states should only be a function of a system's energy, not its volume. Hence, although eqn 13) is often empirically correct, it is a blunder when it comes to logic. And this, as much as anything, demonstrates why thermodynamics needs to be rewritten.

 

One may now ask why eqn 11) seemingly explains so much. It is a case of the power of Boltzmann's statistical thermodynamics, but not of the logic that he bestowed onto it. Specifically, Boltzmann fitted his mathematical theory so that his constant (k) allows his formulation to explain lost work, that being PdV, here on Earth. Furthermore, when developing equations like the Sackur-Tetrode, we employ traditional concepts in statistical thermodynamics and also set the number of accessible states as a function of a prescribed volume (one that obeys Heisenberg's uncertainty), hence for the isothermal expansion 13) applies, thus reinforcing our traditional beliefs concerning entropy. All this is actually based upon circular, self-reinforcing logic.

 

So what about entropy itself? We can conclude that the science of thermodynamics requires a rethink, and this will require bestowing entropy with clarity. How it is actually defined may alter its application, but ultimately the science will simplify, and entropy will either follow phlogiston or become a parameter with clarity.

               Another Example

 

It must be emphasized that there is absolutely nothing wrong with counting the number of accessible energy states (@), as was done by Ludwig Boltzmann, i.e. eqn 11).

 

       S=k[ln(@)]                     11)

 

   This forms the basis of Boltzmann's interpretation of entropy (S).

 

 Next consider that entropy change is also given by:

 

            dS=dE/T                      14)

 

Combining 11) and 14) leads to:

 

         d{k[ln(@)]}=dE/T         15)
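
Eqn 15) can be tested on the same Einstein-solid toy model sketched earlier (again my own illustration, with an assumed energy per quantum): a finite difference of k ln(@) with respect to energy yields 1/T:

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann's constant (J/K)
eps = 1.0e-21     # assumed energy per quantum (J), illustrative only

def S(N, q):
    """Eqn 11) for an Einstein solid: S = k ln(number of states)."""
    return k * log(comb(q + N - 1, q))

N, q = 50, 100
dS = S(N, q + 1) - S(N, q)  # entropy change upon adding one quantum
dE = eps                    # the corresponding energy change
T = dE / dS                 # eqn 15) rearranged: T = dE/dS
print(f"T = {T:.1f} K")     # the slope dS/dE defines the temperature
```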

 

  We previously discussed that, for the isobaric isothermal expansion of an ideal system, the number of accessible states does not necessarily increase. Next, consider that we simply add thermal energy to a system, such as dropping a piece of hot iron into a cup of water. If the piece of iron is large enough and hot enough, then there will be a measurable temperature increase within that cup. Since there is a temperature increase, this process cannot be isothermal. Hence you CANNOT write the isothermal relation eqn 14), i.e. dS=dE/T, wherein entropy is nothing more than an isothermal heat capacity.
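
A worked example of the hot-iron scenario (with my own assumed masses and temperatures) shows that the temperature rise is plainly measurable, and the process therefore not isothermal:

```python
# Final temperature when a hot piece of iron equilibrates with a cup of water:
# heat lost by iron = heat gained by water, m_i*c_i*(T_i - Tf) = m_w*c_w*(Tf - T_w).
m_i, c_i, T_i = 0.10, 449.0, 300.0  # iron: mass (kg), specific heat (J/(kg*K)), temp (C)
m_w, c_w, T_w = 0.25, 4186.0, 20.0  # water in the cup

Tf = (m_i * c_i * T_i + m_w * c_w * T_w) / (m_i * c_i + m_w * c_w)
print(f"Tf = {Tf:.1f} C; the water warms by {Tf - T_w:.1f} K")  # ~11.5 K rise
```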

 

Now ask: does increasing the system's thermal energy increase the number of possible energy states? If the number of accessible energy states is strictly a function of volume, then the traditional answer may wrongly be no. Conversely, if the number of accessible energy states is strictly a function of the system's total energy, then the answer is yes.

 

 In this author's way of thinking, the number of accessible energy states should not be a function of both volume and energy, as traditional thermodynamics seemingly assumes when it applies eqns 11), 14) and 15) to both types of energy transfer. These are fundamental failures in traditional thermodynamics.

 

 Now, there are those who might argue that eqn 14) is applicable if the amount of thermal energy supplied to the cup of water is small enough, i.e. infinitesimally small. The real truth is that such an argument is bogus, being based upon scale. If the heat added is so small that your thermometer cannot read the temperature increase, then it is simply a case of your thermometer not being accurate enough: the temperature went up even though the increase was immeasurably small, i.e. eqn 14) still does not apply. Either get a more accurate thermometer or a smaller cup of water.

 

   None of this diminishes Boltzmann's great mathematical skill; rather, it adds context to statistical thermodynamics.

 

  And lest we forget: I am of the opinion that Boltzmann's constant (k) is what makes S=kln(@) equate to empirical findings here on Earth. In other words, k may be a function of Earth's gravitational field rather than a universal constant.

                  

 

 My paper on Entropy:                       My paper on Second law:

 

 

 Copyright

Kent W. Mayhew 
