Talk:Introduction to entropy

Two Questions

I have been sitting out because I was very happy that we got something into the first paragraph that answers the question for lay readers looking to learn what entropy, in the context of popular usage, means. That said, I have two questions for the presently active editors here:

1. Why is there a section about information entropy in an introduction to thermodynamic entropy? Should the article be moved back to Introduction to entropy?

2. Are you planning to eventually revise the article so that it is a non-technical introduction, either to thermodynamic entropy or entropy in general? Or maybe all of this work should be directed at the main entropy articles instead? Because except for certain sentences, this article is nowhere close to being a non-technical introduction to anything. Frankly, it is a disaster in that respect. Consider: "counterexamples may be included in the concept of 'dispersal' when the 'space' in which the dispersal occurs includes the space of quantum energy levels versus population numbers". If you think this is appropriate for a non-technical introduction article, then either you're working on the wrong article(s), or you're the wrong people to be working on this article. Non-technical means non-technical. Perhaps you have been so deep in physics for so long that a sentence like this appears non-technical and appropriate for a general audience. It isn't. I don't mean this as an insult, but rather as a wake-up call. I fear that the objective here has been lost a long time ago, and this needs to be pointed out.

Perhaps when the present editors are done deciding what goes into the article and where, I can go through it and draft a non-technical version, appropriate to a general audience, for your consideration. -Jordgette [talk] 23:51, 24 November 2020 (UTC)[reply]

  • In answer to question 1, it's because the full picture of thermodynamic entropy cannot be seen without the understanding provided by information entropy. Would you rather simply have the description of how to measure thermodynamic entropy, say that it increases, and then walk away?
In answer to question 2, I agree, that sentence is too technical, and needs to be restated. This article is a hodgepodge of 10 or 20 editors, and I think at least we have a passable introduction. Please, if you can, simplify things where it is needed. PAR (talk) 02:14, 25 November 2020 (UTC)[reply]
  • In answer to question 1. 'Information entropy' is a form of words with a perhaps regrettable history. The word 'entropy' in 'information entropy' was arbitrarily stolen from physical conceptions of entropy. That theft was perhaps unfortunate, but to go beyond the everyday idea of 'spread', the ideas that underlie 'information entropy' are necessary. They include the idea that matter and radiation are composed of many microscopic material particles such as molecules, and of many photons. If you are happy to leave things just with the everyday idea of 'spread', then this article could forget about such things as molecules and photons, and forget about 'information entropy'. That would be a simplification.
Going further, perhaps, as you suggest, particularly bearing in mind our readers who are not interested in physics, the article could even forget about thermodynamic entropy. In that case, the article could scale down into your sentence "The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or of a gradual decline into disorder." Indeed, the article would then be practically an item for a dictionary of ordinary language.
Alternatively, the view that the article should be strongly non-technical could be abandoned. Instead, the article could attempt to explain the physics of entropy for readers who would like to learn a bit more than high-school physics. Chjoaygame (talk) 06:15, 25 November 2020 (UTC)[reply]
  • Remember that Wikipedia has many different articles on entropy covering different aspects. I continue to believe that an article with Introduction in the title should be a non-technical introduction, and only mention the advanced aspects briefly with links to appropriate articles for those who are interested. And an article with Thermodynamic in the title should be primarily about heat, work and energy at a macroscopic level. Disorder can be explained qualitatively with links to the more mathematical aspects, and information theory can be mentioned briefly with a link for more detail.
As for the statement that the term entropy in information theory was "stolen" from thermodynamics, I would prefer to say "copied". If I steal your property, you don't have it any more. If I copy your book or other intellectual property, you still have access to it and can use it. Thermodynamics still uses the term "entropy" even if it is also used elsewhere. Dirac66 (talk) 16:13, 25 November 2020 (UTC)[reply]
Editor Dirac66 is sober and tactful, as well as being wise. I think it relevant here that von Neumann gave, as reason for his suggestion, that no one knows what entropy is, and that it would consequently help Shannon in debates. I used the word 'stolen' in order to emphasise that unsavoury aspect of the history. With respect, I will defend it by noting that Editor Jordgette proposes to emphasise, by giving it a sentence in the most widely read spot of the article, the first paragraph of the lead, the regrettably traditional and misleading "disorder" interpretation, and that Editor Dirac66 still suggests it. Gresham's law at work. In such a pickle, distraction is near to theft of meaning. Breach of copyright is in some respects like theft. I defer, however, to Editor Dirac66's remark that the word can still be used in thermodynamics.
It is still open to Editors to change the title of the article. It is still open to Editors to restrict the main lines of the article to macroscopic thermodynamics, with a bare minimum of reference to the particulate aspect of thermodynamic bodies.
Editor Dirac66 proposes a distinction between non-technical and advanced aspects of the article. Perhaps he is suggesting that a microscopic account is 'advanced' and 'technical'? If, as he seems to suggest, a 'non-technical' account is primarily macroscopic, the new presence of 'thermodynamic' in the title would suggest that the case against a microscopic account would be strong. Displacing Clausius' prior 'disgregation' interpretation, the 'disorder' interpretation was invented on the basis of the microscopic account, and makes what little sense it has on that basis. On the other hand, the 'spread' interpretation makes sense for the macroscopic account; 'disgregation' is nearly synonymous with 'dispersal'.Chjoaygame (talk) 20:15, 25 November 2020 (UTC)[reply]
  • Thank you. We have something in the first paragraph to help out readers who are not interested in physics — that's great. The rest of the article should be for people who are interested in physics, but not (yet) versed in the jargon and mathematics — for example, a high schooler taking high-school physics, a financial analyst doing cross-disciplinary research, or a biologist studying the fundamental properties of living systems. Note that the main Entropy article links to this one as, "For a more accessible and less technical introduction to this topic...." So it should at least be more accessible and less technical. The disambiguation page links as "an explanation on entropy as a measure of irreversibility." Perhaps this article is biting off too much in trying to cover everything?
I'll be happy to help out with accessibility once the content and roadmap have reached, um, a kind of equilibrium. -Jordgette [talk] 16:57, 25 November 2020 (UTC)[reply]
  • This is the opinion of an electronic engineer. It's obvious that all of you have put a great deal of effort into improving this introduction, and it's pretty ingenious. Congrats! I can see I'm entering a very mature discussion. So this is just my impression, looking at the introduction as a nontechnical reader would:
I agree that the introduction should be made simpler for general readers. There will be middle school students, high school dropouts, English majors and overworked single mothers trying to get through community college coming to this article. The discussion of irreversibility and coffee is good. But I'm not sure it's possible to explain thermodynamic entropy to general readers without going into what it means at the particle level. That means introducing some statistical thermodynamics concepts as is done in paragraph 4, in particular the concept that entropy measures the number of "microstates" available to the system. And anyway that is the link to the common meaning of the word as "disorder". Maybe it makes sense to forget the distinction between thermodynamic and statistical entropy and move this article to Introduction to entropy. But I think the statistics needs to be dialed back in the intro; it should be limited to concepts and not include any mathematics. In particular, I don't think the digression into coin flipping in the last two paragraphs belongs in the introduction. --ChetvornoTALK 19:18, 25 November 2020 (UTC)[reply]
Thank you, Editor Chetvorno, for your comment. I agree that the 'disorder' interpretation relies on the particle account, while I add that the 'dispersal' interpretation works for both the bulk and the particle accounts.Chjoaygame (talk) 20:26, 25 November 2020 (UTC)[reply]
  • I agree with the concept of a non-technical introduction that will be meaningful to the layperson, and I appreciate all the hard work and thought that has gone into this article over the years. With that said, if the goal is to explain what entropy is, I think the probabilistic definition needs to be front and center (based on the definition S = k_B ln W). At its most fundamental level, entropy is simply a description of how probable a particular state is. Entropy appears related to the concept of disorder simply because disordered states are more probable than ordered states: there are a lot more ways to arrange a pile of carbon, hydrogen, and oxygen atoms as a scrambled egg than there are to arrange those atoms as an intact, unscrambled egg. Irreversible processes are irreversible because when you switch (micro)states more or less at random, you're far more likely to end up in the more probable state: you wouldn't expect to beat a scrambled egg and have the yolk and white spontaneously separate, for the same reason you wouldn't expect to shake a shoebox with a hundred coins (or more realistically, a few trillion trillion coins) (it's a very big shoebox in this analogy) and have them all come up heads.
The common definitions in terms of disorder, unavailable energy, reversibility, etc. all stem from the fundamental nature of entropy as a measurement of probability, and I think the article would be both clearer and more accurate if that were our starting point. DrPippy (talk) 18:42, 28 November 2020 (UTC)[reply]
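To put rough numbers on the shoebox-of-coins argument above, a minimal sketch (illustrative only; the count of one hundred coins follows the comment, everything else is arbitrary):

from math import comb

N = 100                          # coins in the shoebox
total = 2 ** N                   # equally likely head/tail sequences
all_heads = 1                    # only one sequence is "all heads"
half_split = comb(N, N // 2)     # sequences with an exact 50/50 split

print(f"P(all heads)    = {all_heads / total:.1e}")   # about 8e-31
print(f"P(exact 50/50)  = {half_split / total:.1e}")  # about 8e-2

The all-heads arrangement is not impossible, just so improbable that it is never observed, which is the same sense in which an egg never unscrambles.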
Thank you, Editor DrPippy, for your thoughts. In order to define the probability of a state, one first needs a definition of a state. A 'thermodynamic state' is a persistent object, one that is compatible with many distinct 'microscopic states'. Then one needs a definition of a microstate. It may be an instantaneous or a persistent object. It may itself consist of a single 'state' or of a class of single 'states'. And one needs a definition of a probability, an object that refers to a suitable sample 'space'. The sample 'space' then needs definition. Without such definitions, the interpretation of entropy in terms of probability is vague, or incomprehensible. To write out such definitions is to go beyond the ordinary bounds of the 'non-technical'.
I think it is loose to say that "disordered states are more probable than ordered states". In Boltzmann's formula, the states are equally probable. It is difficult to define a distinction between disordered and ordered states. I think that notions of "disorder, unavailable energy, reversibility" are interpretations, not definitions, of entropy.
In summary, I think that the matter is not so simple. Chjoaygame (talk) 20:15, 28 November 2020 (UTC)[reply]
You're right, of course, that a detailed explanation of microstates vs macrostates is likely to be too technical to be useful to the intended audience here. I think as a practical matter, when people talk about the "state" of a system in ordinary English, they mean the macrostate: e.g., temperature rather than position and velocity of each constituent molecule, etc. When I've talked about the second law/entropy/etc. in classes for non-science majors, I've generally found that even the relatively science-phobic students have a relatively intuitive sense of what I mean when I say "probability of a particular state", or talk about high-entropy states being more likely than low-entropy states. The traditional coin-tossing example—flip two coins, there's two ways to have a head and a tail, but only one way to have two heads or two tails, etc.—has generally seemed pretty accessible, and might be a useful illustration in this article. I've generally found it unnecessary to dive into a more detailed explanation of the technical concepts involved in order to introduce the basic idea of entropy, and it seems like we're in agreement that getting too far into the weeds here would be counterproductive.
I don't think that defining entropy in terms of probability is inherently more confusing than the other definitions. (In fact, it generally seems to clear up a lot of confusion about what entropy is.) More to the point, the other definitions of entropy are at best misleading, if not outright incorrect. DrPippy (talk) 21:58, 28 November 2020 (UTC)[reply]
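The two-coin classroom example mentioned above, spelled out as a trivial sketch (nothing here is from the comment beyond the coin counts):

from itertools import product
from collections import Counter

microstates = list(product("HT", repeat=2))        # ('H','H'), ('H','T'), ('T','H'), ('T','T')
by_heads = Counter(state.count("H") for state in microstates)
print(microstates)   # four equally likely microstates
print(by_heads)      # Counter({1: 2, 2: 1, 0: 1}): "one head, one tail" has multiplicity 2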
Thank you for your further thoughts. You observe that “even the relatively science-phobic students have a relatively intuitive sense of what I mean when I say "probability of a particular state".” Without wishing to be personal, I guess that you are a persuasive teacher. You write “I don't think that defining entropy in terms of probability is inherently more confusing than the other definitions.” Further, you write “More to the point, the other definitions of entropy are at best misleading, if not outright incorrect.” Which other definitions? Are you intending to mean that Guggenheim is outright incorrect in his proposal of the 'spreading', 'dispersal', and 'accessibility' interpretation? And that Peter Atkins has let himself be led up the garden path by it? (By the way, I have no hesitation in thinking that Atkins allows himself grandiose delusions about the scope of the second law.) You haven't told us how the 'spreading' interpretation goes down in your classes. For me, the idea that heat spreads, and that diffusion is a kind of spreading, and that the expansion of a gas into a newly accessible space is a form of spreading, are intuitively attractive accounts. They make no appeal to probability, and marginal appeal to the molecular nature of matter. I think they have a fair claim to being 'non-technical'.
I think it important to distinguish macroscopic and microscopic viewpoints, and to distinguish definitions from interpretations. Boltzmann's so-called 'probability' interpretation refers not necessarily to probability, but necessarily to a number of 'states'. That is why Planck uses the qualified term 'thermodynamic probability', to refer to something nearer to the reciprocal of probability. Boltzmann's 'states' are said to be 'equiprobable'. From the microscopic viewpoint, I think it useful to distinguish 'probability' from 'extent of specification'. The latter names in other words what is customarily called 'quantity of information', defined by Shannon's function, in the topic of coding theory or combinatorics, also known as informatics. Some writers propose that 'probability' in this context is better interpreted in terms of informatics.
Blind application of the 'probability' interpretation has a problem in dealing with Poincaré's recurrence theorem.
There are two elements in the 'probability' account. 'That a process increases the probability of something.' It needs some explanation as to why a probability would change. The increase is due to new accessibility for spread. And 'that a macrostate's possible microstates are equiprobable.' Uniform spread comes to mind.Chjoaygame (talk) 02:29, 29 November 2020 (UTC)[reply]
  • I think you are over complicating things. This article is an introduction. It is the material I used to teach to chemistry students in the second year of an Australian or UK 3 year chemistry degree. You are just over complicating everything! --Bduke (talk) 09:06, 29 November 2020 (UTC)[reply]
  • I agree. We are going to have readers much less educated than college students coming here: middle school students, HS dropouts. A Wikipedia article also doesn't have as much space as a chemistry textbook chapter to get this stuff across. I guarantee you that English majors are going to have a hard time understanding what coin flipping has to do with stirring cream into coffee. We need to get much more simple and concrete. Diagrams would help a lot. Kittel p.47, fig. 2.9 has a series of simple diagrams of boxes of point particles showing things that increase entropy: adding particles, adding thermal energy, increasing the volume, decomposing molecules. --ChetvornoTALK 09:41, 29 November 2020 (UTC)[reply]
  • [Deleted unnecessary rambling] I ran across this article: https://www.space.com/43138-life-is-chaotic-entropy.html. I think it does a nice job explaining a physicist's view of entropy to a popular audience (including some of the intricacies of microstates vs. macrostates, etc.) without getting too bogged down in technical detail. Could we use something like this as a starting point? I'm happy to draft some text here if it'd be helpful to have something concrete to critique. DrPippy (talk) 15:22, 29 November 2020 (UTC)[reply]
I agree with the above three editors. It would benefit Wikipedia to have an article that shows how coin-flips and coffee-stirring have the concept of entropy in common, for readers who don't need to see the derivations of equations. But the desire for textbook rigor continues to make even the lead of this article unnecessarily opaque. Which is why I suggested that such intensive work go toward improving the non-introductory entropy articles instead.
A previous version of the lead used this example of billiard balls, and for the purposes of connecting probability to mixing, I would like to see it re-inserted:

Consider a billiards table with 15 balls on it. If we broadly observe that the balls are all lined up along one edge of the table, there is a certain finite number of combinations of the individual balls' locations that would be consistent with that broad observation of the balls being lined up in such a manner. However, if we broadly observe that the balls are spread out across the table (perhaps in a seemingly random arrangement), then there is a much higher number of combinations of the individual balls' locations that would be consistent with the broad observation that the balls are spread out. We say that the spread-out arrangement has high entropy, compared to the lined-up arrangement's low entropy.

-Jordgette [talk] 18:27, 30 November 2020 (UTC)[reply]
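One rough way to attach numbers to the quoted billiard-ball example, treating the table as a 20 x 10 grid of ball-sized cells (the grid size is an arbitrary assumption, not part of the example):

from math import comb

EDGE_CELLS = 20                  # cells along one edge of the table
TABLE_CELLS = 20 * 10            # cells on the whole table
BALLS = 15

lined_up = comb(EDGE_CELLS, BALLS)      # balls confined to the edge row
spread_out = comb(TABLE_CELLS, BALLS)   # balls anywhere on the table

print(lined_up)                  # 15504
print(spread_out)                # roughly 1.5e22
print(spread_out // lined_up)    # about 1e18 times as many spread-out arrangements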
A nearly disqualifying aspect of the coffee-and-cream example, at least for me, is that usually I stir the cream into my coffee with a coffee spoon. The point of the second law is that the spread occurs spontaneously, without intervention by an animate agency, such as me and my spoon. Yes, if I waited all day without stirring, a deal of spreading would occur spontaneously, but my coffee would be cold by then, and so I would perhaps have thrown it out.Chjoaygame (talk) 19:48, 1 December 2020 (UTC)[reply]
You have a point. It will probably not be clear to newbies that the diffusion and increase of entropy will occur without stirring. The experiment does show the unidirectional nature of entropy, as no amount of stirring will return the coffee to the microstate it was originally in, with the coffee and cream separated. But it could be misleading. There are other examples without stirring, like if you leave a perfume bottle open the room will soon smell of perfume, or if you put an ice cube in a glass of still water it will come to a common temperature. --ChetvornoTALK 20:30, 1 December 2020 (UTC)[reply]

for ease of editing

For weeks, this discussion has gone in two (or more) very different directions. How are we going to resolve this? -Jordgette [talk] 18:27, 30 November 2020 (UTC)[reply]

Yeah, it sounds like the debate is bogged down in vague generalities. I think the only solution is to start proposing specific wording, as Jordgette did, either here or editing the article per WP:BRD. Then editors can say what they think about specific sentences and paragraphs, and consensus wording can be thrashed out. My personal feeling is that the introduction should be aimed at middle school students in the spirit of WP:ONEDOWN. So there should be no math, and the debate about proper terminology should be saved for the article body; if the word "disorder" in the intro helps teenagers understand, use it. I kind of like the general explanatory approach taken on the website [1] proposed by DrPippy, (if the unencyclopedic hip entertaining teen style and room cleaning similes are edited out). @DrPippy: if you want to write a version, go ahead. --ChetvornoTALK 21:43, 30 November 2020 (UTC)[reply]
Aw, but I like the hip entertaining room cleaning similes! (In all seriousness, I do think these sorts of analogies are by far the best way to communicate technical concepts to a non-technical audience, even if they're not completely encyclopedic.) I'm hashing some stuff out on my sandbox right now and I'll post something here when I've got something that might not be terrible. Would welcome any feedback and suggestions both before and after. DrPippy (talk) 21:49, 30 November 2020 (UTC)[reply]
Well I hear you but we do have standards (WP:EPSTYLE). Actually I'm not sure that any of these similes - billiard balls, coin flips, disorganized bedrooms - are going to help the low-level readers they are aimed at. A lot of nontechnical readers do not have any understanding of things on an atomic level - they are not aware that a cup of coffee sitting still is actually composed of particles moving randomly at high speed. Maybe it would be better to just explain what's going on without similes. I think diagrams would help more. A series of diagrams showing a box with red balls on one side and blue balls on the other diffusing into one another when the partition is lifted, would explain increase of entropy on an atomic scale. It would explain the coffee & cream example directly. Or better yet, an ANIMATION showing such diffusion! --ChetvornoTALK 23:06, 30 November 2020 (UTC)[reply]
Editor Chetvorno makes a good suggestion. His word "diffusion" is pretty close in meaning to the more general terms found in the literature, the 'disgregation' of Rudolf Clausius, and the 'spread', 'dispersal', and 'accessibility' of Edward A. Guggenheim. Guggenheim in 1949 wrote
Various attempts have been made to describe entropy in simple language. An unpublished fragment by W. GIBBS suggests that entropy is a measure of the extent to which a system is mixed up.
Lord KELVIN is supposed to have suggested that entropy measures the extent to which a system is run down. Another suggestion will be made below, but before this is done it must be emphasized that whatever it may be that entropy measures for an isolated system, this same something is measured for a system in a thermostat by Massieu's function.
To the question what in one word does entropy really mean, the author would have no hesitation in replying 'Accessibility' or 'Spread'. When this picture of entropy is adopted, all mystery concerning the increasing property of entropy vanishes. The question whether, how and to what extent the entropy of a system can decrease finds an immediate answer.
...
In conclusion it may be pointed out that nowhere has the word 'probability' been used. The popular statement that a system has a natural tendency to change from less probable to more probable states is in the author's opinion stretching the word 'probable' beyond its recognized meaning.
It took some years till the 'spread' interpretation found its way widely. I don't know of an earlier or more authoritative expression of it than Guggenheim's. I would be glad of enlightenment on this point.Chjoaygame (talk) 00:51, 1 December 2020 (UTC)[reply]
I would love to take on the animation. There's a strong argument that such a visual representation will get the concept across more immediately and clearly than examples and similes made of words. -Jordgette [talk] 17:14, 1 December 2020 (UTC)[reply]
For the sake of unambiguity, with, I trust, the permission of Editor Chetvorno, I here quote precisely his above admirable words:
Maybe it would be better to just explain what's going on without similes. I think diagrams would help more. A series of diagrams showing a box with red balls on one side and blue balls on the other diffusing into one another when the partition is lifted, would explain increase of entropy on an atomic scale. ...  Or better yet, an ANIMATION showing such diffusion!
This is a simple, direct, and accurate account of the molecular view of the second law.Chjoaygame (talk) 17:34, 1 December 2020 (UTC)[reply]
It explicitly indicates the initial condition of systems separated by a partition, which is then lifted so as to produce a final state. For all we know, the animation after the lifting might run long enough to occasionally exhibit Poincaré recurrence! Chjoaygame (talk) 22:54, 1 December 2020 (UTC)[reply]
@Chjoaygame: Thanks, that's what I meant to say.--ChetvornoTALK 21:25, 1 December 2020 (UTC)[reply]
@Jordgette: Didn't mean to sound so negative above. I could live with the billiard ball example. You (as well as the other editors here) have experience. You clearly have ideas. I'd say go ahead and write the introduction you'd like to see, to get things started. --ChetvornoTALK 21:25, 1 December 2020 (UTC)[reply]
@Jordgette: If you can make a diffusion animation, that would be awesome! I've done some animations, just freehand GIFs drawn in SVG, but the ideal way to do this would be in MATLAB. I have just a tiny bit of experience with MATLAB, I'm nowhere near able to do that. --ChetvornoTALK 21:25, 1 December 2020 (UTC)[reply]
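For what it's worth, here is a minimal sketch of such a diffusion animation in Python with matplotlib rather than MATLAB; it assumes non-interacting random walkers rather than real molecular dynamics, and the particle count and step size are arbitrary:

# A sketch only: random walkers standing in for gas molecules after the partition is lifted.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

rng = np.random.default_rng(0)
N = 200                                   # particles of each colour
red = rng.uniform([0.0, 0.0], [0.5, 1.0], size=(N, 2))   # red starts in the left half
blue = rng.uniform([0.5, 0.0], [1.0, 1.0], size=(N, 2))  # blue starts in the right half
step = 0.01                               # random-walk step per frame (arbitrary)

fig, ax = plt.subplots()
ax.set_xlim(0, 1); ax.set_ylim(0, 1); ax.set_aspect("equal")
red_dots, = ax.plot(red[:, 0], red[:, 1], "r.", markersize=4)
blue_dots, = ax.plot(blue[:, 0], blue[:, 1], "b.", markersize=4)

def update(frame):
    # Both colours wander freely; positions are clipped back inside the unit box at the walls.
    for pts in (red, blue):
        pts += rng.normal(0.0, step, size=pts.shape)
        np.clip(pts, 0.0, 1.0, out=pts)
    red_dots.set_data(red[:, 0], red[:, 1])
    blue_dots.set_data(blue[:, 0], blue[:, 1])
    return red_dots, blue_dots

anim = FuncAnimation(fig, update, frames=400, interval=30, blit=True)
plt.show()                                # or anim.save("diffusion.gif") with a GIF writer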

@Dr. Pippy - regarding the "messy room" versus the "orderly room", I think this discussion should go in the "Introductory descriptions of entropy" section, which lists convenient, but ultimately misleading explanations of entropy. The reasons for their helpfulness, and where they break down, should be listed here. The order/disorder idea breaks down when you realize that the vague concept of "order" must be defined before the concept can make any real sense. If I take a desk that looks very messy by conventional standards and write down the location and orientation of every potato chip and crumpled piece of paper and say "that is what I define as an orderly desk", then there are many possible clean desks that, by my definition, deviate from my ordered desk and are therefore disordered, and so the entropy of a conventionally ordered desk is larger. It's more a matter of how much information it takes to describe the stuff on the desk, not how "ordered" it is. For my definition of an ordered desk, it takes no extra information; for the "clean" desk, it takes a lot of information to express its deviation from my "ordered" desk; and even more to describe a conventionally messy desk other than my standard.

@Jordgette - The billiard ball example is another example of the order/disorder concept. You say that there are more ways for the balls to be spread out on the table than there are for them to be lined up. But if you take a photo of one case of the spread-out balls, there are many ways for the balls to be different from that, including the one where they are lined up. You have defined "lined up" to mean order, but I have chosen a particular "spread out" configuration to define order, and neither one of us is wrong. Entropy is a measure of how many ways there are to deviate from whatever you want to call an "ordered" situation. You say that putting them in a line is "ordered", as in y=ax+b. What if it's not a first-order polynomial, but carefully arranged according to a second, third, or fourth order polynomial? Then your linear order will be very disordered. The order/disorder idea ultimately requires a definition of order, and ultimately that translates into knowing where everything is, not an appeal to some vague intuitive idea of order. Again, I think this belongs in the "Introductory descriptions of entropy" section where both the positive and negative aspects of these types of ideas are laid out.


@Chjoaygame - The idea that thermodynamic entropy is the spatial "spreading" of energy and/or mass is simply not sufficient. It assumes that the spatial degrees of freedom are the only ones that can be out of equilibrium. A microstate is represented by a single point in phase space, and the set of microstates (even the nonequilibrium ones) that a system can be in forms a subset of the phase space, or, equivalently, a "volume" in phase space. The logarithm of this volume is proportional to the entropy and is dominated by equilibrium states.

For example, if we have an ideal monatomic gas, fixed volume V and internal energy U, the phase space will have 6N dimensions, 3N spatial degrees of freedom, 3N momentum degrees of freedom. The positional part of the phase volume will be restricted by the fixed volume V and the momentum part will be restricted by the fixed energy with U = p²/2m. The equipartition theorem says that, at equilibrium, all degrees of freedom will contain, on average, an energy of k_B T/2. In other words, the energy "spreads out" over all degrees of freedom (assuming they can share), subject to the above constraints. Saying that entropy is spatial spreading is saying that only the spatial degrees of freedom can be out of equilibrium, which then proceed to equilibrium.
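In symbols (added here for reference in my own notation, not quoted from the comment above), the momentum part of phase space is restricted to the surface

\sum_{i=1}^{3N} \frac{p_i^{2}}{2m} = U, \qquad S = k_\mathrm{B} \ln \Omega ,

where Omega is the accessible phase-space volume and the entropy relation holds up to additive constants from the choice of phase-space measure; equipartition then assigns an average of k_B T/2 to each quadratic degree of freedom at equilibrium.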

In the entropy of mixing example, considering both gases on either side of a container, the phase space will have 12N dimensions, 6N for gas A, 6N for gas B. In the beginning, the phase volume will be separated into two parts, with gas A occupying only part of the phase volume assigned to the position of the A molecules, and gas B only occupying a part of the volume assigned to B. The idea that entropy is a spatial spreading of energy implicitly assumes that the momentum degrees of freedom are unimportant since they are already equilibrated by the fact of the common temperature. The energy will then "spread out" into the unoccupied positional degrees of freedom and entropy will increase. Spatial energy spreading occurs.
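For concreteness, the standard ideal-gas result for this mixing case (a textbook formula added for illustration; the equal particle numbers are an assumption, not part of the comment above):

\Delta S_{\mathrm{mix}} = -k_\mathrm{B}\left(N_A \ln x_A + N_B \ln x_B\right) = 2 N k_\mathrm{B} \ln 2 \quad (N_A = N_B = N,\ x_A = x_B = \tfrac{1}{2}),

which matches the doubling of the spatial volume that becomes accessible to each species when the partition is removed.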

This need not be the case. We might just as well assume that the positional DOF's are already equilibrated and the momentum DOF's are not. For example, if we have a plasma, which is a gas at high enough temperature so that there are free electrons and ions, and we subject this plasma to an electric field, the electrons will acquire an "equilibrium" temperature which is higher than that of the ions, due to their much smaller mass. If the electric field is "turned off", the electrons and ions will thermally equilibrate to the same temperature and the entropy will increase. There will have been no spatial spreading of energy or mass. I know, the existence of the electric field implies an external "machine" (battery, capacitor, whatever) and it's not cut and dried, but the point is that in the instant after the electric field is turned off, the system is in a state that is practically the same as just before it was turned off, which shows that such a state is not physically impossible. We don't need the analog of a removable wall to illustrate the point.
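A worked version of this kind of purely thermal equilibration, assuming two subsystems with equal, temperature-independent heat capacities C (a simplification for illustration, not the plasma case specifically):

\Delta S = C \ln\frac{T_f}{T_1} + C \ln\frac{T_f}{T_2} = C \ln\frac{T_f^{2}}{T_1 T_2} > 0, \qquad T_f = \frac{T_1 + T_2}{2},

which is positive whenever T_1 differs from T_2, because the arithmetic mean exceeds the geometric mean; the entropy rises with no spatial transport of matter.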

As another example, we could postulate a diatomic ideal gas with rotational degrees of freedom. Suppose the spatial and momentum degrees of freedom are all at one equilibrium temperature and the rotational degrees of freedom at another. The rotational degrees of freedom will then equilibrate with the others, arriving at a common temperature and the entropy will increase. There will have been no spatial spreading of energy. I can't think of a real-world example of this situation, but that doesn't mean there isn't one, and besides, we don't need the analog of a removable wall. In the entropy of mixing, we don't need a removable wall, it's just a practical way of establishing the initial condition. We can simply postulate that there are two different gases occupying two sides of a container, both at the same temperature and pressure, and carry on from there.

What I am trying to say is that tying an increase in entropy to a spatial spreading of energy is a special case, based on a number of unnecessary assumptions and therefore does not give a true account of entropy. As a spread of energy in phase space, yes, but assuming that spread only involves the positional degrees of freedom, no. PAR (talk) 22:24, 1 December 2020 (UTC)[reply]

Agreed. Quite right. The rigorous account is about spread in phase 'space', not merely in ordinary physical space. The article should say so. It remains true that if ordinary spatial spread can occur because of new spatial accessibility, the second law says it will occur spontaneously. The spread in phase space is more conceptually advanced, and will require a suitable exposition in the advanced parts of the article.
Some small points. The nominal primary topic of the article is entropy. The definition of entropy is not the same kind of logical object as an account of a thermodynamic process such as is considered in the second law, which is subsidiary in the article. The removable wall is important, even essential, to give an adequate account of a thermodynamic process. The wall, and the thermodynamic operation that lifts it, are discussed in high quality textbooks. The wall is logically necessary, not just a practical way, for the thermodynamical equilibrium of the separated initial condition that occurs in the full statement of the law. A thermodynamic equilibrium has a practically infinite duration, from its past, or into its future, or is bounded by well time-separated thermodynamic operations; it is nothing like an instantaneous state. The zeroth law tells us that if two systems, each in its own respective internal thermodynamic equilibrium, are in thermal connection, they will have the same temperature. For a relevant initial condition exhibiting different temperatures, there needs to be a wall to prevent thermal connection. With respect, that you can't think of a real-world exception is telling; you are a good thinker. In finer detail, every instantaneous state that is eventually visited by the trajectory in phase space belongs to the thermodynamic equilibrium that defines the trajectory. The trajectory traverses a wild diversity of instantaneous states. That is why the Poincaré recurrence time is relevant. It tells how long it takes for a traversal of all those instantaneous states.Chjoaygame (talk) 23:52, 1 December 2020 (UTC)[reply]
Re the Poincare recurrence time - rethinking it, the Poincare recurrence idea has no place in classical thermodynamics. Classical thermodynamics is about systems in the thermodynamic limit of infinitely large systems, in which the Poincare recurrence time is infinite - i.e. it never happens. PAR (talk) 11:45, 2 December 2020 (UTC)[reply]
There is a difference of viewpoint here. The proposition, that "classical thermodynamics is about systems in the thermodynamic limit of infinitely large systems", is, with respect, one that I find, shall I politely say, unacceptable. It comes from a mindset of statistical mechanics as the foundation, which I regard as foreign to classical thermodynamics as such. In my reading, classical thermodynamics is founded in finite macroscopic bodies such as are tested in experiments. Such bodies may be considered from a statistical mechanical viewpoint, as being composed of finitely many molecules. For statistical mechanics, it may be convenient to say that the number of molecules is so large as to be, for mathematical purposes, virtually infinite. I think that if, contrary to the classical thermodynamic view, the system were taken to be founded as an infinitely large body, then the notion of the Poincaré recurrence time would be nonsense. The state of thermodynamic equilibrium in classical thermodynamics has an infinite aspect, in its duration in time, which will far exceed the Poincaré recurrence time. For classical macroscopic thermodynamics, Poincaré recurrence, referring to instantaneous microscopic states, is, of course, foreign. But I think it makes good sense for the statistical mechanical analysis of a finite body.Chjoaygame (talk) 14:15, 2 December 2020 (UTC)[reply]
Regarding the need for a wall, or its equivalent, to separate various degrees of freedom, you may be right, but I am not convinced. For the mixing example, the system with the wall is described as two different gases in mutual equilibrium, but occupying different parts of the container. Is this description valid an infinitesimal time after the wall is removed? I am saying it is. Are you saying it is not? If it is a valid description of what amounts to a non-equilibrium state, then the fact that a wall produced it is irrelevant. Maybe this argument is insufficient to prove that it is possible for entropy to increase by spreading only into non-spatial degrees of freedom because the wall used to prepare the initial state has not been specified. However, I don't think a convincing argument has been made that, in principle, such a wall can only exist for spatial degrees of freedom, which is what you would need in order to show that entropy increase is unavoidably associated with a spatial spread of energy. PAR (talk) 11:45, 2 December 2020 (UTC)[reply]
Yes, the gases are still separate an infinitesimal time after the wall is removed. In that infinitesimal time, there are instantaneous microscopically specified states, but a state of thermodynamic equilibrium is not an instantaneous microscopically specified state. It is a long lasting macroscopically specified state. During its life, it can be described by a trajectory in microscopically specified phase space, each point of which is an instantaneous state.
I am very uncomfortable with your terms 'non-equilibrium state' and 'equilibrium state' here. We are here not interested in non-equilibrium states as macroscopic entities. For me, the term 'non-equilibrium state', as I read you, would refer to an instantaneous microscopic state that was constructed to belong to a trajectory that described a transient process, that is to say a process that started and ended with distinctly different defining or constitutive macroscopic parameters. Such a trajectory does not belong to a state of thermodynamic equilibrium such as we are studying here. A state of thermodynamic equilibrium has strictly persisting unchanging defining or constitutive macroscopic state variables, extensive or intensive as the case may be, as I have detailed above. Every point in a thermodynamic equilibrium trajectory belongs to its defining thermodynamic equilibrium, and no point in that trajectory specifies a 'non-equilibrium' microscopically specified instantaneous state. There are tracts which baffle themselves, and perhaps others, by mistaking this.
If the wall is immovable and impermeable, then 'mutual equilibrium' is the same as separate internal equilibria, and the word 'mutual' is practically redundant, or indeed misleading.
I think my position hasn't been made clear. The proper view is that 'spreading' is a description of the exploration of phase 'space' by the above-mentioned trajectory. I think you would find that acceptable? I agree with your point that such is not precisely, rigorously, and fully described by the phrases 'spread of energy in ordinary physical space' or 'spread of matter in ordinary physical space'. But it remains true that when a thermodynamic operation removes a partition, then mostly there will be spread of energy or matter in ordinary physical space. If it is desired to start the article with a full, rigorous, and precise statement, then I agree that such a statement must refer to spread in phase 'space', which I think is your point? But this article, as I understand the desires of our fellow colloquers, is primarily introductory and pedagogical. For this purpose, one may choose to start with a gentle approach, not full and rigorous. A full and rigorous account, referring to phase 'space', could follow deeper into the article, if such were desired. I find that the notions of spread of matter and energy in ordinary physical space seem useful for pedagogical introduction. I regard them as not misleading, and merely as lacking full detail and precision. Chjoaygame (talk) 14:23, 2 December 2020 (UTC)[reply]
I don't have a problem with the idea that entropy is very often characterized by a spatial spreading of energy or mass, but I would have a problem if it was implied that it is always characterized in this way. The idea that the bottom line essence of entropy is to be found in the spatial spreading of energy or mass should never be proposed, and it doesn't need to be proposed in order to keep things simple.PAR (talk) 17:34, 2 December 2020 (UTC)[reply]
I agree that the idea that "the bottom line essence of entropy is to be found in the spatial spreading of energy or mass" should never be proposed, and that it doesn't need to be proposed in order to keep things simple. I don't propose so. But I see it as helpful to use the words 'spread' and 'accessible' in an initial introduction, without asserting them as an explicit formal interpretation. In agreement with your view, an explicit formal interpretation should note that the proper bottom line idea refers to phase 'space'. I also agree with Editor Chetvorno that the interpretative similes such as 'disorder' and 'spread' don't need explicit formal notice as distinguished interpretations. Chjoaygame (talk) 23:57, 2 December 2020 (UTC)[reply]
Ok, I will put some words into the introduction to that effect and see how it flies. PAR (talk) 14:10, 3 December 2020 (UTC)[reply]
Ok.Chjoaygame (talk) 14:36, 3 December 2020 (UTC)[reply]

This is an article introducing entropy. Everything above should have no part in this article. It makes it too complicated. I have taught physical chemistry in several universities. The first year courses really do not say much about entropy. It is mostly introduced in a second year course called "Physical Chemistry". OK, this may not match the situation in the US exactly but it is my experience in UK and Australia. Almost everything above should not be included in such a course. It could be in the entropy article, but not in this introduction. --Bduke (talk) 00:16, 2 December 2020 (UTC)[reply]

I agree completely. The reason for the discussion, though, is to decide how to simplify things. Do we say, in simple terms, that entropy is the spreading of energy in physical space and put it in the introduction, or do we say that spatial spreading of energy is a helpful but ultimately flawed way of viewing entropy and put it in the "Introductory descriptions of entropy" section? PAR (talk)
I concur with the just foregoing remarks of Editor PAR, that the reason for the above discussion is to decide how to simplify things, and that the details of the above discussion are not intended to appear in the article.Chjoaygame (talk) 14:15, 2 December 2020 (UTC)[reply]

There seem to be two points of view: 1) the idea that thermodynamics is fundamental and that statistical mechanics, as an explanation of thermodynamics, is subordinate, and 2) the idea that statistical mechanics is fundamental and thermodynamics is a special case of, and derived from, statistical mechanics. I don't think either can be "proven" to be correct. I found a very interesting web page at http://www.av8n.com/physics/thermo/ which takes the second point of view. The author (Denker) has given a lot of thought to the subject of thermodynamics and statistical mechanics and makes some very good points. He sees "entropy" as basically information entropy, and all things flow from that. He gives a good criticism of the "dispersal" and "spreading" points of view, offering a number of examples in which they fail (see chapters 2.5 and 9.8). An example in which the dispersal idea fails is this: Consider two counter-rotating rings on a common axis, with equal but opposite angular momentum, separated by a small distance, each at, let's say, the same temperature. This is an equilibrium system. Now move the rings into contact with each other. Friction will slow and finally stop them. This will be an equilibrium system, both will be in contact, at rest, and at a higher temperature and entropy than the original two systems, yet there has been no spatial energy dispersal. PAR (talk) 18:10, 16 December 2020 (UTC)[reply]
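To attach a formula to the ring example, assume the rings carry rotational kinetic energy E_rot, have combined temperature-independent heat capacity C, and start at temperature T (these symbols are assumptions for illustration, not taken from Denker):

\Delta S = C \ln\frac{T_f}{T} > 0, \qquad T_f = T + \frac{E_{\mathrm{rot}}}{C},

so the entropy increases even though no energy or matter spreads out in ordinary space; the ordered rotational energy is simply converted into thermal energy in place.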

Moving Forward With Improving This Article

After weeks of trying to be truly collaborative, my patience has expired. I apologize for once again coming off as rude, but moving forward, I will be working with editors @Chetvorno:, @DrPippy:, @Bduke:, and anyone else who can demonstrate a responsible and reader-empathetic understanding of the mission here: to provide an introductory explanation of entropy for laypersons. The discussion is verging on disruptive behavior, and again I think such energy would be better spent on improving the other entropy articles, rather than this introductory overview, which requires a special hand.

This article needs to be short, it needs to be in plain language, and there should be little or no math. It should provide a light survey of the various applications of entropy, with links to their respective articles. Some exacting technical rigor may need to be sacrificed to accomplish this mission. I am sorry, but it gets clearer by the day that the mission will not be accomplished otherwise. I daresay the consensus, among the new contributors combined with the long history of comments above, cries out for the article to be much more accessible and less technical and exacting.

I will proceed by preparing an animation, and I look forward to reading DrPippy's contributions. Also, a fair warning: Inappropriate or unwelcome responses in this section may be moved to an earlier section. I am over the walls of text and mathematical minutiae. The English major or the financial analyst we edit for doesn't care about degrees of freedom, or phase spaces, or S = k_B ln W. None of that is helpful here. We are going to move forward with improving this article and getting it to where it needs to be. -Jordgette [talk] 23:30, 1 December 2020 (UTC)[reply]

Okay, I've taken an initial crack at drafting an intro which (hopefully) strikes the right balance between encyclopedic tone and accessibility. I've tried to incorporate (1) a basic definition of entropy as an expression of the number of arrangements corresponding to a particular state (without talking about multiplicity, microstates, or macrostates!); (2) an illustration of the second law; and (3) examples of irreversibility and disorder. I can think of many problems with my current draft (not the least of which is the lack of citations), but hopefully it's a decent starting point. So here's what I have at the moment:
In thermodynamics, entropy is a numerical quantity which describes the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement. For example, there are many ways to arrange the molecules of a gas inside its container that yield the same temperature and pressure; any two molecules could be swapped, but the overall state of the system would remain unchanged. If the gas were moved to a larger container, the number of possible arrangements of the constituent molecules would increase greatly; the entropy would therefore be larger as well.
The second law of thermodynamics is one of the foundational principles of physics; it states that entropy tends to increase over time. A gas which is initially confined to only the lower half of its container will quickly expand to fill the upper half of its container as well. From the standpoint of entropy, this is a result of the fact that there are many more ways to arrange the gas molecules in the entire container than there are to arrange them in only half of the container. Thus the entropy of the gas increases as it expands to fill the container. (Note: I think a figure here could be useful. Maybe a 2-molecule "gas", which would only have one way to have both molecules in the bottom half, but two ways to have one molecule in each half, implying that it's twice as likely for this "gas" to fill the container as to remain confined at the bottom. I can make a figure like this, but it won't happen until the weekend!)
The second law implies that some processes cannot be undone. The process of scrambling an egg increases its entropy, as there are many more ways to arrange the egg molecules in a scrambled state than there are to arrange those same molecules with an intact yolk separated from the white. The scrambling process essentially randomizes the positions of the egg molecules; once the egg is scrambled, it is practically impossible for continued stirring to cause all of the egg molecules to happen upon one of the relatively few arrangements where the yolk and white are separated, and so an egg cannot be unscrambled. (Note: I can imagine other analogies here that might be considered more encyclopedic; perhaps mixing two liquids, or lighting a match? I think this version works fine, but I'm not opposed if someone would prefer a different illustration.)
Entropy is often considered to be a measurement of disorder. This equivalence, while not always applicable, is generally practical because disordered states almost always have higher entropy than ordered states. There are relatively few ways to organize a deck of cards so that it is separated by suit compared to the number of arrangements where the suits are mixed together. There are relatively fewer ways to organize the objects in a room in a tidy fashion compared to the number of haphazard arrangements. There are fewer ways to arrange pieces of glass into an intact window than a shattered window. A shuffled deck, a messy room, and a broken window all possess more entropy than their more well-ordered counterparts. The second law therefore implies that it is relatively easy for a highly-ordered system to become disorganized, but much more difficult to bring order to a disorganized system.
I'd love to hear any feedback; feel free to edit this as mercilessly as necessary! DrPippy (talk) 00:28, 2 December 2020 (UTC)[reply]
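To accompany the figure suggested in the second paragraph of the draft above, a quick enumeration (illustrative only; the molecule counts are arbitrary):

from itertools import product

for n in (2, 4, 10):
    arrangements = list(product(("bottom", "top"), repeat=n))
    confined = sum(all(half == "bottom" for half in a) for a in arrangements)
    print(f"{n:2d} molecules: {confined} of {len(arrangements)} equally likely "
          f"arrangements keep the gas in the bottom half")
# 2 molecules: 1 of 4;  4 molecules: 1 of 16;  10 molecules: 1 of 1024

The confined fraction shrinks as 1/2^n, which is why a real gas with trillions of trillions of molecules is never seen huddled in half of its container.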
Great start; thank you. Do you think the article should be moved back (renamed) to Introduction to entropy? Because I think it would be great to have a general "introduction to entropy" article that touches on all of its aspects and interpretations, and can serve as a portal to the various non-introductory entropy articles. Perhaps "Introduction to thermodynamic entropy" is too specific for an introductory article. If I were that English major or financial analyst, I might stumble on the first two words being "in thermodynamics" (they might well not know what thermodynamics is).
As I've mentioned a bunch of times on this page, I feel strongly that the first paragraph should mention the popular meaning of the word, as the current version does: The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or of a gradual decline into disorder. Many people come to this article having seen the word in that context, and the article needs to help them. If we were to move this article back to Introduction to entropy, perhaps the first paragraph can mention that entropy has various subtly different applications in various disciplines, and that the word has also entered popular usage. Then the 2nd paragraph can launch into the thermodynamic account, and we're off to the races. -Jordgette [talk] 17:10, 2 December 2020 (UTC)[reply]
I originally thought yes, the article should be renamed Introduction to entropy; now I'm not too sure. The problem is that in physics and thermodynamics, entropy seems to be part of natural science: it has a clear definition as the number of microstates corresponding to a state, and natural consequences; it controls the direction of spontaneous processes seen in nature. The wider applications of entropy appear more artificial. It will be hard to explain to general readers the connection of the thermodynamic definition to the entropy of a text message, or an image, or a computer program, or a chess position. I think it should be renamed, but most of the article should still concentrate on the thermodynamic application. --ChetvornoTALK 19:22, 2 December 2020 (UTC)[reply]
I agree with Jordgette above that the common meaning of entropy as "disorder" could be earlier in the intro. --ChetvornoTALK 19:36, 2 December 2020 (UTC)[reply]
It does make sense to move the disorder stuff earlier. I'd be uncomfortable with this in the lead; since it follows from the multiplicity definition rather than the other way around, I think you need to have the multiplicity definition under your belt to understand why entropy can be understood in terms of disorder. Would it be okay to keep the first paragraph more or less as is, but then immediately segue into the disorder paragraph? I think I can move things around pretty easily.
If I make changes to the original draft, is it okay to overwrite the original draft (just so the talk page doesn't get too long)? Or would it be better to post the whole thing from scratch? Either way, I can take another crack at this later tonight or tomorrow.
Lastly, I agree with renaming the article Introduction to entropy. The word "thermodynamic" is an unnecessary barrier to entry for the non-technical reader, and when the concept of entropy comes up in the context of a popular audience, I feel like it's almost always talking about the thermodynamic sense. Might not be a bad idea to include a (short) section on entropy in other contexts where you could look at Shannon entropy, etc., but I would actually be okay excising that material from this article. DrPippy (talk) 21:42, 2 December 2020 (UTC)[reply]
I guess my point is that if we change the name we can't excise it completely; the article will have to at least mention all the types of entropy. But maybe the more abstract types should be included per WP:SUMMARYSTYLE, in brief paragraphs with links. I think we should resist the tendency of statistical thermodynamicists to include a lot of mathematics, which will simply not be understood by 99% of readers coming to this introductory article. --ChetvornoTALK 22:23, 2 December 2020 (UTC)[reply]
Much preferable to post the whole thing from scratch. Changes made to a talk page post can easily make nonsense of another editor's  response to the initial version. It is talk-page convention to allow for that. If there has been no response, changes are usually reasonable.Chjoaygame (talk) 22:04, 2 December 2020 (UTC)[reply]
Agree that it could be better to go back to the  title Introduction to entropy, and with the related suggestions. Reservation: as to the general structure of the article, I continue to agree with Editor PAR, that the logic of the article should be primarily grounded in thermodynamic entropy. There are valuable generalisations or extensions of the notion of entropy in statistical mechanics, mathematics,  and informatics, and I see it as probably a good idea to indicate them in the article.Chjoaygame (talk) 22:41, 2 December 2020 (UTC)[reply]

first revision of DrPippy's draft

In thermodynamics, entropy is a measure of the possible diversity of arrangements of the constituent parts of a body of matter and radiation. For example, there are many ways in which the molecules of a body of gas can arrange themselves inside its container with one and the same temperature and pressure; any two molecules can be swapped while the overall state remains unchanged. If the gas is allowed to spread into a larger container, the number of accessible arrangements of the molecules increases greatly; the entropy increases to indicate that.

The second law of thermodynamics is well established. Beyond the just-mentioned example, another is that of two different gases that are initially separated into two compartments of a container by a partition. When the partition is removed, the gases spread themselves throughout the container. The total entropy increases in this process because there are many more ways in which the different gas molecules can arrange themselves in the entire container than in the separated compartments. (Note: this is the place for the animation mentioned above, as proposed by Editor Chetvorno. Such an animation would truly earn this article a place in Wikipedia. Anyone?)

More generally, the second law says that natural spontaneous processes do not undo themselves. When a match is lit, it spreads light, heat, and matter. It cannot unlight itself. To reconstitute a fresh match requires an elaborate manufacturing process, which could hardly recapture the scattered matter of the original match.Chjoaygame (talk) 01:50, 2 December 2020 (UTC)[reply]

This version seems inferior to me. The crucial fact that entropy is a numerical measure is not mentioned. The word "diversity" seems out of place. "The Second law of thermodynamics is well established." is totally without context, it isn't mentioned again until the next para., and the central point that the 2nd law says entropy can never decrease in a spontaneous process is omitted. The description of the match example makes it sound like a reduction of entropy can only occur in a manufacturing process. --ChetvornoTALK 19:59, 2 December 2020 (UTC)[reply]
Thank you for your comment. Action taken. The 'never decreases' version refers to the trivial case of a "process" that doesn't actually proceed, or to the imaginary or fictive theoretical limiting case of a reversible "process". Those special, unnatural, or highly technical cases unnecessarily distract from the main fact expressed in the second law, that in a spontaneous process, entropy always increases. I like the word 'diversity' there.Chjoaygame (talk) 22:18, 2 December 2020 (UTC)[reply]

update[edit]

In thermodynamics, the entropy of a body of matter and radiation is a physical quantity that measures the spread or accessible range of distribution of motions of its microscopic elementary components. For example, there are many different ways for the molecules of a body of gas to move around inside its container, with one and the same temperature and pressure. If the gas is allowed to expand into a larger container, the range of accessible different motions of the molecules increases; that is measured as a numerical increase in entropy.

Such an increase of entropy illustrates an important physical principle, the Second law of thermodynamics. Another simple example of an increase of entropy according to the law occurs when heat is allowed to spread from the surroundings into a body of gas in a container, giving its molecules a greater accessible range of speeds of motion and rotation. Another process described by the law involves two different gases that are initially separated into two compartments by a partition in a container. When the partition is removed, so that a greater volume becomes accessible for each of them, the molecules of the gases spread themselves to move throughout it. In this process, the total entropy increases because more ways have become accessible for the different gas molecules to move around in the entire container than were accessible in the separated compartments. (Note: this is the place for the animation mentioned above, as proposed by Editor Chetvorno. Such an animation would truly earn this article a place in Wikipedia. Anyone?)

More generally, the second law says that natural spontaneous processes of interaction between bodies do not undo themselves, and that this is always evident in an increase of entropy. In contrast, for a decrease of the entropy of a body, an organised expenditure of energy is necessary. For example, when a match is lit, it consumes oxygen and spreads light, heat, and matter with a corresponding increase of entropy. It cannot unlight itself. To reconstitute a fresh match requires an organised process, which could hardly recapture the scattered matter of the original match.

The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or to a gradual decline into disorder.Chjoaygame (talk) 12:58, 3 December 2020 (UTC)[reply]
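An aside on the expansion example in the first paragraph of this update, offered only as a concreteness check and not as proposed article text: for the simplest case, an ideal gas, the standard textbook result ties the 'greater accessible range of motions' to a definite number,
<math>\Delta S = nR\,\ln\frac{V_2}{V_1} = N k_\mathrm{B}\ln\frac{V_2}{V_1}</math>
for a slow isothermal expansion from volume <math>V_1</math> to <math>V_2</math>. Read through Boltzmann's relation <math>S = k_\mathrm{B}\ln W</math>, doubling the volume available to <math>N</math> molecules multiplies the count of accessible spatial arrangements by <math>2^N</math> and so raises the entropy by <math>N k_\mathrm{B}\ln 2</math>.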

New draft of introductory paragraphs[edit]

Here's my latest draft:

In popular usage, entropy is often considered to be a measurement of disorder. This usage stems from the physical definition of entropy as a numerical quantity which describes the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement. For example, there are many ways to arrange the molecules of a gas inside its container that yield the same temperature and pressure; any two molecules could be swapped, but the overall state of the system would remain unchanged. If the gas were moved to a larger container, the number of possible arrangements of the constituent molecules would increase greatly; the entropy would therefore be larger as well.
The equivalence between entropy and disorder arises because disordered states almost always have higher entropy than ordered states. There are relatively few ways to organize a deck of cards so that it is separated by suit compared to the number of arrangements where the suits are mixed together. There are relatively fewer ways to organize the objects in a room in a tidy fashion compared to the number of haphazard arrangements. There are fewer ways to arrange pieces of glass into an intact window than a shattered window. A shuffled deck, a messy room, and a broken window all possess more entropy than their more well-ordered counterparts.
The second law of thermodynamics is one of the foundational principles of physics; it states that entropy tends to increase over time. A gas which is initially confined to only the lower half of its container will quickly expand to fill the upper half of its container as well. From the standpoint of entropy, this is a result of the fact that there are many more ways to arrange the gas molecules in the entire container than there are to arrange them in only half of the container. Thus the entropy of the gas increases as it expands to fill the container. (Note: I think a figure here could be useful. Maybe a 2-molecule "gas", which would only have one way to have both molecules in the bottom half, but two ways to have one molecule in each half, implying that it's twice as likely for this "gas" to fill the container than to remain confined at the bottom. I can make a figure like this, but it won't happen until the weekend!)
The second law, taken in combination with the relationship between entropy and disorder, implies that it is relatively easy for a highly-ordered system to become disorganized, but much more difficult to bring order to a disorganized system. Furthermore, some processes cannot be undone at all. The process of scrambling an egg increases its entropy, as there are many more ways to arrange the egg molecules in a scrambled state than there are to arrange those same molecules with an intact yolk separated from the white. The scrambling process essentially randomizes the positions of the egg molecules; once the egg is scrambled, it is practically impossible for continued stirring to cause all of the egg molecules to happen upon one of the relatively few arrangements where the yolk and white are separated, and so an egg cannot be unscrambled.

This addresses the popular view of entropy = disorder right off the bat, but also keeps the physical definition in the first paragraph. Not sure about the overall flow, but I think we're moving in the right direction! DrPippy (talk) 22:36, 2 December 2020 (UTC)[reply]
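Not proposed as article text, but as a concreteness check on the two-molecule figure note at the end of the draft above: a few lines of Python (the choice of four molecules is arbitrary) that enumerate the bottom/top microstates and count how many of them give each overall split.
<syntaxhighlight lang="python">
from itertools import product
from collections import Counter

# Each of N distinguishable molecules sits in either the bottom ("B") or
# the top ("T") half of the container.  Every assignment is one microstate;
# the macrostate is simply how many molecules are in the bottom half.
N = 4  # N = 2 reproduces the 1-versus-2 counting in the figure note above

macrostates = Counter()
for microstate in product("BT", repeat=N):
    macrostates[microstate.count("B")] += 1

for n_bottom, ways in sorted(macrostates.items()):
    print(f"{n_bottom} molecule(s) in the bottom half: {ways} microstate(s)")

# For large N the all-in-one-half macrostates become a vanishing fraction
# of the 2**N equally likely microstates, which is the point of the figure.
</syntaxhighlight>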

Very unhappy to see the traditional but confusing 'disorder' interpretation being proposed to overwhelmingly dominate the article. The tradition is a major reason why entropy puzzles people.
It would be a retrograde step to help it continue to overwhelmingly dominate, when the 'spread' interpretation is so much easier and more intelligible, and is now also mainstream. There is good objection to, and criticism of, the confusing tradition in the literature. The article would probably be unnecessary if the regrettable 'disorder' interpretation hadn't caught on. Editor DrPippy didn't answer my question about how he had found the 'spread' interpretation in the empiric of teaching.
It is unhappy, even misleading, or even mischievous, to say that entropy "tends" to increase over time. It increases in every non-trivial thermodynamic process. Better not to fudge that with the vague word "tends". Thermodynamic entropy refers to discrete states of thermodynamic equilibrium. It is not a continuously time-varying quantity, and it is poor pedagogy to allow a hint to the contrary. Boltzmann's function can be regarded as an 'entropy' only in a derivative or highly technical sense. Thermodynamic entropy gets its character from the unique symmetry of thermodynamic equilibrium, which is a strictly unchanging state. Phil Attard in Sydney is proposing a proper extension of the concept of entropy to non-equilibrium conditions, but that is a project for the future, far outside the scope of this article.
The process of scrambling an egg is an unhappy instance to use as a prime example of the second law, because scrambling is not a simple example of a natural or spontaneous process; it implies a personal stirrer. Talk of stirring and manual arranging of molecules distracts from the key to the second law, that the molecules naturally rearrange themselves. A boiled egg is a simpler but more subtle example, because it does not call upon a person to scramble the white and the yolk; it is not very suitable for our purpose. I dislike the words "essentially randomizes". They are vague and allude to a bundle of distracting and dubiously relevant concepts.Chjoaygame (talk) 23:37, 2 December 2020 (UTC)[reply]

welcoming[edit]

Great to see we are making some solid progress. I think the first paragraph should be welcoming and give the reader a chance to settle in before launching into the physics. This is what I'd propose:

Entropy is an important concept in physics, specifically the field of thermodynamics, and in information theory. The word has also entered popular usage as a measurement of the disorder of a system, or to refer to a lack of order or predictability, or to a gradual decline into disorder.[citation] A more physical interpretation of entropy refers to the spread of energy or matter in a system. Entropy shows that many physical processes can go in only one direction in time — for example, even in a perfectly sealed room, a puff of smoke will dissipate into the air, but it will never reform into a concentrated puff. To understand how the physical interpretation led to its interpretation in information theory (as well as its popular interpretation), it helps to consider the thermodynamic origins of the term and how they tie into the field of physics known as statistical mechanics.

I'd also like to make the unspeakable proposal that the lead section of this article contain only one paragraph. It seems difficult, if not impossible, to provide a gloss of entropy's various interpretations that is appropriate for a lead section; it gets weedy very quickly, and that's intimidating for a lay reader who has not even reached the body of the article. I further propose that after such a bare-bones lead paragraph, the body of the article begin with the first section, Entropy in thermodynamics...something similar to the thermodynamics-focused paragraph below.

The animation is coming along. It's essentially 200 boxes, with 100 red balls and 100 blue balls, and I'm using a random number table to determine the motion of each ball from frame to frame. It would have been easier to write a program, but unfortunately I have to do it manually because I don't have those skills. But I'm doing it in Blender, so it should look pretty good. -Jordgette [talk] 17:29, 3 December 2020 (UTC)[reply]
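In case it is useful later, here is a rough sketch of what such a program might look like (purely illustrative: the 20 × 10 grid, the neighbour-swap rule, and the frame count are all assumptions standing in for the random number table, and it is not meant to replace the Blender animation).
<syntaxhighlight lang="python">
import random

# Sketch of the mixing animation: a 20 x 10 grid of 200 cells holding
# 100 red and 100 blue balls, with each frame produced by many random
# swaps of neighbouring balls inside a closed box.
WIDTH, HEIGHT, FRAMES = 20, 10, 100

# Start "ordered": red fills the left half, blue fills the right half.
grid = [["red" if x < WIDTH // 2 else "blue" for x in range(WIDTH)]
        for _ in range(HEIGHT)]

def step(grid):
    """One animation frame: attempt a few hundred random neighbour swaps."""
    for _ in range(500):
        x, y = random.randrange(WIDTH), random.randrange(HEIGHT)
        dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nx, ny = x + dx, y + dy
        if 0 <= nx < WIDTH and 0 <= ny < HEIGHT:
            grid[y][x], grid[ny][nx] = grid[ny][nx], grid[y][x]

for frame in range(FRAMES):
    step(grid)
    reds_on_left = sum(row[:WIDTH // 2].count("red") for row in grid)
    print(f"frame {frame:3d}: red balls still in the left half = {reds_on_left}")
</syntaxhighlight>
The printed count drifts from 100 towards about 50 and then fluctuates around it, which is the approach to the mixed equilibrium that the animation is meant to show.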

  • A few suggestions:
In popular usage, the word 'entropy' means lack of order or predictability, or gradual decline into disorder. The word originated in physics, in the field of thermodynamics, where it refers to the natural spontaneous spread of matter and radiation, or of energy, into all the diverse forms of motion and places to which they have access. For example, in a sealed room, a puff of smoke will irreversibly disperse, and will not concentrate itself again. Though such dispersal occurs according to the laws of physics for microscopic particles, in order to make calculations about it, it has helped to regard it as random or probabilistic, in a branch of physics known as statistical mechanics. In turn, this has been studied in engineering terms, in information theory. This has led to highly mathematical analysis of the evolutions of various motions.
For accuracy as an illustration of the second law of thermodynamics, an animation will show first the equilibrium of motions in two separate compartments, followed by the removal of the partition, leading to the spreads of the different coloured balls into a new equilibrium of motion.Chjoaygame (talk) 19:26, 3 December 2020 (UTC)[reply]
  • I don't care for the explanation of entropy in terms of "spread" of energy or matter; this is one consequence of the second law, but I'm doubtful that it's the most important one. I don't believe that using this concept as the basis for the explanation of entropy is a standard approach (even in a popular definition), nor well-supported by the physics. I'd have a hard time getting behind any definition that doesn't start from the idea of looking at the number of microscopic arrangements that lead to the same large-scale state, as this is really the basis for all the other ways of thinking about (physical) entropy, whether you prefer disorder, dispersal/diffusion, irreversibility, or anything else. DrPippy (talk) 18:39, 3 December 2020 (UTC)[reply]
I am sorry to see Editor DrPippy feeling so. Entropy is about motion, and to talk just about 'arrangements' misses much of the nature of entropy. As for support in thermodynamics and chemistry, it would be hard to find more authoritative sources than Edward A. Guggenheim and Peter Atkins. Nowadays, there are many textbooks that use the 'spread' language because their authors find it helps for teaching beginners, which is closely relevant for this article. Again, it would be hard to find more authoritative sources of reasoned physical objection to the more traditional 'disorder' view than Edwin Thompson Jaynes and Walter T. Grandy, Jr, on the grounds that it is obscure and baffling.Chjoaygame (talk) 19:52, 3 December 2020 (UTC)[reply]
  • I don't have strong opinions about number of paragraphs in the lede, or how those paragraphs are organized, but I do feel very strongly that the first paragraph needs to have some appropriately-wordsmithed version of "a numerical quantity which describes the number of different ways that the constituent parts of a system can be arranged to get the same overall configuration." Otherwise, you're really talking about the effects of the second law, rather than what entropy is. It's great to be non-technical, and if that means we sacrifice some degree of precision in terms of the nitty-gritty details then that's a sacrifice well worth making. But I don't think this means we should fail to provide a non-mathematical but physically accurate definition of entropy in the first paragraph, and none of the alternatives that I've seen so far do this. DrPippy (talk) 18:39, 3 December 2020 (UTC)[reply]
  • A physically accurate picture of physical entropy requires more than an account of configurational arrangement: it requires also an account of motion. That is why phase space is used, rather than merely configuration space. Guggenheim's 'spread' and 'dispersal' are good ordinary language words to express entropic motion.Chjoaygame (talk) 23:10, 3 December 2020 (UTC)[reply]
  • I hear you. I am doing what I can to prevent readers from being scared off by the first paragraph, which is the monumental challenge here. What do you think of this version:

Entropy is an important concept in physics, specifically the field of thermodynamics, and in information theory. The word has also entered popular usage as a measurement of the disorder of a system, or to refer to a lack of order or predictability, or to a gradual decline into disorder.[citation] As a numerical measure, entropy describes the number of ways that components of a system (such as the molecules of a gas) can be arranged in order to result in the same overall configuration. If such a system is allowed to change without any external influences, the number of those arrangements — i.e., the system's entropy — invariably goes up with time, never down. This fact is captured in an important law of physics known as the second law of thermodynamics. A consequence of the second law of thermodynamics is that many physical processes can go in only one direction in time. For example, even in a perfectly sealed room, a puff of smoke will dissipate into the air, but it will never reform into a concentrated puff. The closed system of air-plus-smoke begins with low entropy, and it ends with high entropy: the smoke is evenly distributed throughout the room, and at that point (known as thermodynamic equilibrium), any random movement of air or smoke molecules will still result in the smoke being equally distributed. At equilibrium, the system's entropy has reached its maximum value.

-Jordgette [talk] 19:29, 3 December 2020 (UTC)[reply]
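One way to put a number on "it will never reform into a concentrated puff", offered as background rather than as proposed wording: for <math>N</math> smoke particles moving independently, the chance that all of them are found even in one particular half of the room at the same moment is <math>2^{-N}</math>, already about <math>10^{-30}</math> for <math>N = 100</math> particles, and unimaginably smaller for a real puff of smoke.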
Spontaneous and diverse motion is crucial to physical entropy. To say that 'components can be arranged' obscures this. The components move spontaneously, without waiting passively for an agency to arrange them. It is true that counting numbers of ways of arrangement is helpful as a mathematical approximation, but it misses motion as a key aspect of entropy.Chjoaygame (talk) 20:04, 3 December 2020 (UTC)[reply]
The use of arrangements as distinct from motions is a mathematical device, and is short on physical content. One of our principles here is to avoid mathematics. Our interest is in physics.Chjoaygame (talk) 22:12, 3 December 2020 (UTC)[reply]
The importance of diversity of motion has been carefully brought to our attention by Editor PAR, when he has emphasised that entropic spread is in phase space. The words 'diversity of motion' render motion in phase space in ordinary non-technical language.Chjoaygame (talk) 21:23, 3 December 2020 (UTC)[reply]
The words "evenly distributed" miss the diversity of entropic motion. Guggenheim's word 'spread' better expresses that diversity.
To be more explicit than "many physical processes", one could say something such as 'minglings amongst the components of bodies of matter and radiation'. In contrast, such processes as planetary orbiting are not very helpful illustrations of the second law.Chjoaygame (talk) 22:32, 3 December 2020 (UTC)[reply]
The clause "Entropy is an important concept in physics, ..." is hardly necessary. People who read or write in this article will mostly already have decided that entropy is an important topic. Indeed, we mostly think that everything in Wikipedia is important. The space taken up by that clause could probably be better used.Chjoaygame (talk) 06:37, 4 December 2020 (UTC)[reply]
I object strongly to a phrase like "As a numerical measure, entropy describes the number of ways that components of a system (such as the molecules of a gas) can be arranged in order to result in the same overall configuration." This is NOT the definition of THERMODYNAMIC entropy. This is a description of INFORMATION entropy as applied to EXPLAIN thermodynamic entropy. To blur the difference between the two is a disservice to the reader, no matter how new they are to the subject. It does not simplify things, it outright confuses the issue. The idea that thermodynamic entropy and information entropy are distinct is not rocket science. The change in thermodynamic entropy is a measured macroscopic experimental quantity that has nothing to do with atoms, molecules, microstates, macrostates, probabilities or any of those statistical mechanical concepts. The laws of thermodynamics are experimentally observed relationships between these macroscopic measurements, again with no reference to statistical mechanics. Information entropy is a mathematical construct which Boltzmann used, along with a number of ASSUMPTIONS, unverified experimentally, to very successfully EXPLAIN thermodynamic entropy and the laws of thermodynamics. Information entropy is NOT thermodynamic entropy and to purposely obfuscate the difference in the name of simplicity is wrong. The distinction is simple, to blur it does not produce simplicity, it produces an outright falsehood. The new user who does not pursue the subject walks away with a false idea of thermodynamic entropy, and those who do pursue it will eventually discard this article as so much hogwash. PAR (talk) 03:40, 5 December 2020 (UTC)[reply]
Broadly speaking, I agree with this comment by Editor PAR. We may observe that it was written with the article title 'Introduction to thermodynamic entropy' in mind. Because of the complexity of the problem, 'information entropy' is more a program for research than a complete physical account.Chjoaygame (talk) 11:53, 5 December 2020 (UTC)[reply]
Likewise a phrase like "The word originated in physics, in the field of thermodynamics, where it refers to the natural spontaneous spread of matter and radiation, or of energy, into all the diverse forms of motion and places to which they have access." This is a blatantly false statement. Thermodynamic entropy refers to a measurement involving thermodynamic parameters, and its magic is that it is a state variable. The change in entropy between A and B is independent of the path taken to get there. Now the struggle begins to intuitively understand it. "disorder", "spreading", these are all initially useful but ultimately incorrect ways of characterizing entropy, "disorder" being the worst offender, because it refers to the information entropy interpretation. "Spreading" at least deals with effects very often (but not always) associated with a change in thermodynamic entropy. The essential experience of thermodynamic entropy is irreversibility. This is what the second law says and that is ALL it says, and that is a simple concept to illustrate. Thermodynamics does not presume to explain it, only to define it and characterize it via the second law, and statistical mechanics does not presume to define it or characterize it, only to explain it and to explain the second law. PAR (talk) 03:56, 5 December 2020 (UTC)[reply]
I am replying to Editor PAR's above comment referring not to the draft by Editor Jordgette that leads this section 'welcoming', but to my offering above under the line ″A few suggestions″. I feel I need to defend myself from the charge made by Editor PAR in his comment. He writes ″Likewise a phrase like "The word originated in physics, in the field of thermodynamics, where it refers to the natural spontaneous spread of matter and radiation, or of energy, into all the diverse forms of motion and places to which they have access." This is a blatantly false statement.″ Editor PAR writes as if I had written something such as 'thermodynamic entropy is spread'. No, I wrote that the word refers to spread; of course, PAR is right in objecting that 'spread' is not the thermodynamic definition of entropy. I have written many times above that 'spread' is an interpretation, intending to make it clear that an interpretation is not a definition. I think that what I wrote is perfectly consistent with Editor PAR's sentence ″Thermodynamic entropy refers to a measurement involving thermodynamic parameters, and its magic is that it is a state variable.″ For me, there is a big difference between saying what a word refers to and saying what is its definition, or what it is. I am deliberate in using the word 'refers'.
In more detail, I am sorry to see Editor PAR giving less credit to the 'spread' interpretation than I think it deserves. Editor PAR has repeatedly, and in my view rightly and importantly, said that if the word 'spread' is to be used in this context, then it should properly include 'spread' of instantaneous states in microscopic phase space, and should not be limited to spread in ordinary physical space. Intended to say just that in ordinary language are my above words ″into all diverse forms of motion and places to which they have access″. In particular, my words ″all diverse forms of motion″. If those words of mine fail to convey, in ordinary language, what Editor PAR is referring to, then I have failed to adequately articulate my thoughts. I intend by those words to avoid the technicality of phase space, which I think it is not convenient to name explicitly or define in the very most introductory part of the article. My understanding of Editor PAR's right insistence on considering the full depth or intricacy of the phase space description is, to the best of my power, indicated by the words ″diverse forms of motion.″ Perhaps some more helpful rendering of this important idea into ordinary language may come forth for us.Chjoaygame (talk) 11:37, 5 December 2020 (UTC)[reply]
I don't like the sentence ″At equilibrium, the system's entropy has reached its maximum value.″ It can too easily be read to suggest that entropy grows gently towards thermodynamic equilibrium. This belies the discrete character of thermodynamic processes, and underplays the fine precision, or in Editor PAR's word, 'magic', in the quantity 'entropy'. Moreover, and more concerningly, it slips towards a vague notion of 'maximum'. Maximum with respect to what range of possibilities?Chjoaygame (talk) 12:28, 5 December 2020 (UTC)[reply]
  • I think it worth noticing that Editor Jordgette is using a random number table to determine the motion. My point is that he is using a mathematical table as a convenient way of modeling the system, instead of using a calculation that fully implements the laws of motion. In other words, he is virtually using a probability approach as a mathematical trick, not as a really physical account. I am not objecting to his doing so, just pointing out the difference between probability theory and physics that was noted by Guggenheim.Chjoaygame (talk) 12:15, 5 December 2020 (UTC)[reply]

I'm going to be honest; I think I really prefer the approach in my (second) version. (Definite need for wordsmithing, and probably could be a little clearer in places.) I think that most or all of the issues that Chjoaygame has raised are red herrings in terms of the purpose of this article (although certainly interesting from the standpoint of philosophy of science).

I think the article (at least the intro) really needs to do only one thing: it needs to connect the popular conception of entropy (disorder/chaos/decay, although maybe this is shaped by me playing too many videogames) with the scientific one. The physical reason that the second law leads to greater disorder is that there are (vastly) more disordered states than ordered states; similar arguments apply to irreversibility, diffusion in physical or phase space, etc. The only requirement is that there needs to be some mechanism causing the system to randomize its (micro)state; the laws of probability and the Central Limit Theorem take care of the rest. Whether that mechanism is a person with a spoon stirring cream into a cup of coffee, the thermal motion of constituent molecules, or the machinations of the Flying Spaghetti Monster and His Noodly Appendage is not important to the very narrow purpose of explaining, in layman's terms, what entropy is, and how that idea helps us to understand the world around us.

As to the spread interpretation, two points: (1) it's a consequence of the nature of entropy and the second law, not a description or explanation of it; (2) this interpretation of entropy hasn't captured popular attention the way the disorder interpretation has. We need to work with what the popular interpretation is, not what we want it to be. (FWIW, I don't think either disorder or spread is a good model for the fundamental nature of entropy; as I've said before, entropy is fundamentally defined in terms of probability.)

I think we need to be very careful to avoid unnecessary complexity; there's a time and place to take a look under the hood and think about what's really going on, but almost never when you're introducing a new idea. DrPippy (talk) 13:08, 5 December 2020 (UTC)[reply]

I just saw PAR's objection a few paragraphs above; unfortunately it's a bit hard to keep track of everything that's happening on this talk page!
Not to be overly blunt, but I am in complete disagreement with his point. The statistical physics definition of entropy (Boltzmann/Gibbs; the two are essentially equivalent in an isolated system) is (1) a physical definition of entropy, predating and unrelated to informational entropy; (2) demonstrably equivalent to the thermodynamic interpretation dS=dQ/T (see the math-box in Entropy_(statistical_thermodynamics)); and (3) able to offer a relatively clear explanation for related phenomena in a way that the macro-scale thermodynamic equivalent does not. For the purposes of this article, this third point is the most important one.
Is it possible this is a physics/chemistry disagreement? My experience is that physicists will start with a definition of entropy based on microscopic arrangements (k ln W), whereas the chemists find more use in the macro-scale thermodynamic interpretation (dQ/T). In either case, I would generally argue that in general the macroscopic laws used in physical chemistry are a result of the statistical mechanics of large collections of microscopic particles, and therefore the interpretation of entropy given by statistical physics is more fundamental. The sentence that PAR is objecting to is my best attempt to put this interpretation into non-technical language. I'm very open to improved wording, but I'd argue strenuously against using a different interpretation. DrPippy (talk) 13:08, 5 December 2020 (UTC)[reply]
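For anyone following this exchange, the quantities being contrasted are, in standard notation (background only, not proposed article text):
<math>S = k_\mathrm{B}\ln W \quad\text{(Boltzmann)}, \qquad S = -k_\mathrm{B}\sum_i p_i \ln p_i \quad\text{(Gibbs)}, \qquad \mathrm{d}S = \frac{\delta Q_\mathrm{rev}}{T} \quad\text{(Clausius)}.</math>
The Gibbs form reduces to the Boltzmann form when the <math>W</math> accessible microstates are equally probable (<math>p_i = 1/W</math>), and Shannon's information entropy <math>H = -\sum_i p_i \log_2 p_i</math> has the same mathematical shape, which is where the shared name comes from.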

Proposed move to Introduction to entropy[edit]

@PAR: @Jordgette: @DrPippy: @Chjoaygame: Re. PAR's comment in previous section: We talked above about moving the article to Introduction to entropy, thus including information entropy. This seemed to be favored by a majority of you (and me). Maybe this should be decided before going on, since it will have an impact on the introduction. Shall we move the article?--ChetvornoTALK 05:38, 5 December 2020 (UTC)[reply]

Agree as above --Bduke (talk) 06:20, 5 December 2020 (UTC)[reply]
Agree --ChetvornoTALK 07:37, 5 December 2020 (UTC)[reply]
Indifferent --Chjoaygame (talk) 10:14, 5 December 2020 (UTC)[reply]
Agree DrPippy (talk) 12:16, 5 December 2020 (UTC)[reply]
Agree ---Jordgette [talk] 15:22, 5 December 2020 (UTC)[reply]
 Done Consensus, performed move --ChetvornoTALK 21:05, 6 December 2020 (UTC)[reply]
The archive of the talk page is yet to be moved.Chjoaygame (talk) 00:24, 7 December 2020 (UTC)[reply]
I have moved the archive of the talk page, I hope correctly, following the technique as I read it from PAR. Perhaps someone experienced in such moves may check what I have done.Chjoaygame (talk) 06:24, 8 December 2020 (UTC)[reply]

Collecting opinions[edit]

There is at least one objection above to leading off with the statistical-mechanical description. There is also at least one insistence that motion be mentioned in the first paragraph. If this article is to be about entropy in general — including the popular concept and information entropy — then it's inappropriate to lead off with a purely thermodynamic account. The statistical-mechanical account applies to both thermodynamics and information, and it explains how the physical concept led to the popular concept. That it "explains but does not define" thermodynamic entropy is a subtlety that seriously gums up the mission of getting entropy across to the English major or the financial analyst. I urge certain editors, for the nth time, to get over it in deference to that mission. If they absolutely cannot, perhaps the distinction can be addressed in the body of the article. -Jordgette [talk] 15:22, 5 December 2020 (UTC)[reply]

The words “subtlety that seriously gums up the mission” and “in deference to that mission” remind me that some time ago, Editor Jordgette put his cards on the table thus: “Ironically your argument supports starting this article by describing entropy as a measure of disorder, which is far and away the most intuitive way to describe what it is.” I don't know if he still thinks or intuits so. Such a respectable source as Edwin Thompson Jaynes had an alternative view when he wrote “Glib, unqualified statements to the effect that "entropy measures randomness" are in my opinion totally meaningless, and present a serious barrier to any real understanding of these problems.” I guess intuition and understanding are different.
Assuming that the article is to be re-named to become about entropy in general, it isn't evident to me precisely where Editor Jordgette's collection of opinions just above leads. In particular, it doesn't mention the most general conception of entropy, that of the mathematical theory of dynamical systems, just a little while ago relegated to the archive. The comments, such as this one, of our IP-only mathematician friend, 67.198.37.16, would return to relevance if the article were re-named to become about entropy in general.Chjoaygame (talk) 00:35, 6 December 2020 (UTC)[reply]
Perhaps the 'spread' interpretation (not definition) has not caught on, but it has entered the mainstream. The 'disorder' doctrine has long dominated, but has led to the present situation, in which this article has been a source of complaint for over a decade. Does that remind me of some fable about doing the same thing and expecting a different result? The 'disorder' story may appeal to many, but it is another question as to whether it offers useful understanding; Jaynes and others think not. Amongst those who have taken the spread interpretation seriously, some have found it helpful for teaching, which has a priority in this article. For me, the 'disorder' interpretation is strained or baffling, while I find the 'spread' interpretation natural and enlightening. Is Wikipedia physics a stronghold for the kind of Groupthink that is characterized by active suppression of alternative viewpoints, or does Wikipedia policy favour a neutral point of view?
The beginning of a list of texts using the energy dispersal interpretation
Atkins, P. W., de Paula J. Atkins' Physical Chemistry, 2006, W.H. Freeman and Company, 8th edition, ISBN 9780716787594
″We appear to have found the signpost of spontaneous change: we look for the direction of change that leads to dispersal of the total energy of the isolated system. This principle accounts for the direction of change of the bouncing ball, because its energy is spread out as thermal motion of the atoms of the floor.″
Bell, J., et al., 2005. Chemistry: A General Chemistry Project of the American Chemical Society, 1st ed. W. H. Freeman, 820pp, ISBN 0-7167-3126-6. I would need a trip to the library to get a copy of this.
Brown, T. L., H. E. LeMay, B. E. Bursten, C.J. Murphy, P. Woodward, M.E. Stoltzfus 2017. Chemistry: The Central Science, 10th ed. Prentice Hall, 1248pp, ISBN 9780134414232
"entropy A measure of the tendency for energy to spread or disperse, thereby reducing its ability to accomplish work. In a general sense, it reflects the degree of randomness or disorder associated with the particles that carry the energy." Editor PAR rightly observes that energy spreads not only in ordinary space, but more accurately also in phase space, but that does impugn use of the spreading idea when it may help. Brown et al. admit the compatibility of the 'disorder' and the 'dispersal' interpretations. So could we.
Ebbing, D.D., and S. D. Gammon, 2017. General Chemistry, 11th ed. Centage Learning 1190pp, ISBN 9781305580343
"Entropy (S) a thermodynamic quantity that is a measure of how dispersed the energy of a system is among the different possible ways that a system can contain energy."
Petrucci, Herring, Madura, Bissonnette 2011 General Chemistry: Principles and Modern Applications, 10th edition, 1426 pages, Pearson Canada ISBN 9780132064521
"Entropy, S, is a thermodynamic property related to the number of energy levels among which the energy of a system is spread. The greater the number of energy levels for a given total energy, the greater the entropy."
Perhaps I may seem lazy, but I became bored with this exercise.Chjoaygame (talk) 07:41, 6 December 2020 (UTC)[reply]
Just a comment. The fact that Atkins' Physical Chemistry uses the energy dispersal interpretation is important in the UK and a few other countries including Australia, where I am, because it is a text book that is very widely used in chemistry departments. Perhaps chemists are now happier with this interpretation than are physicists. However I have been retired for 17 years now. --Bduke (talk) 08:11, 6 December 2020 (UTC)[reply]
The question isn't whether or not spread is a scientifically valid interpretation of entropy. Clearly it is (although I'll note again that this interpretation can be explained in light of the statistical mechanical explanation, but not vice versa). The question is whether this interpretation is a useful way to explain the popular interpretation of entropy, which is "disorder," whether we like it or not; my argument has been that it is not. The necessity of this argument stems from the apparent consensus that this article should provide a useful introduction to the scientific idea of entropy to someone who only has some hazy notion of the popular conception. It would be helpful if you could engage with these arguments in some way. DrPippy (talk) 15:30, 6 December 2020 (UTC)[reply]
  • Replying to Editor DrPippy. Thank you for your comment, Editor DrPippy.
You have asserted that "it is not". I have twice asked you how the 'spread' interpretation has worked in your teaching practice, but you have not replied. I don't see your assertion as amounting to an argument.
The 'popular conception', as you observe, is indeed "hazy". To shift from it into some scientific understanding is our task. In the literature written by physicists who have tackled the matter explicitly, the discussion usually concludes that the 'disorder' account is scientifically almost meaningless. It is true that the 'disorder' account is recited in many textbooks, but few actually try to do more than recite it, or to show explicitly how it has scientific meaning. The 'disorder' account works at a handwaving or loosely metaphorical level, but scarcely more. The notion of 'spread' has obvious physical meaning that is not merely handwaving. For example, Planck says of his thermodynamic systems that they are "chiefly ... homogeneous". 'Evenly spread' would pass for an ordinary language rendition of 'homogeneous'. The spread in entropic motion is not perfectly even, beyond equipartition, and that is why I propose the words 'diversity of motion'. The motions of microscopic bodies obey the laws of physics; to refer to such obedience simply as 'disorder' doesn't quite cover it. As is evident in what I have written above, I am not urging expungement of the 'disorder' metaphor. I am, however, putting the case for the more physical 'spread' language as well.Chjoaygame (talk) 16:46, 6 December 2020 (UTC)[reply]
  • DrPippy is of course correct, and I would (once again) urge the two minority-opinion editors on this page to devote their energy to other entropy articles, where the mission to connect the popular concept with the physical concept won't continue to be disrupted.
DrPippy, I appreciate your proposal for the introductory paragraphs (dated 2 December[2]); perhaps that is the direction we want to go in. However, entropy is a physical concept first and a popular concept second, derived from the physical concept. So I think we have to lead with the physical concept. The question is how to do that without making the first few sentences highly unpalatable to the lay reader. -Jordgette [talk] 16:25, 6 December 2020 (UTC)[reply]

Sounds like the discussion is getting bogged down in generalities again. Chjoaygame continues his "wall of words" replies in which he appeals to "authorities" such as Edwin Thompson Jaynes, Edward A. Guggenheim, Peter Atkins, and college textbook authors. WP:TLDR territory. I don't see these as too relevant to this article, at least not the introductory sections. These scientists were writing for other scientists, physicists and chemists, or students of physics and chemistry at the university level. We are writing for readers at a lower level, who have only taken high school or middle school science or maybe no science at all. That means tailoring explanations to their level of understanding, as I think Jordgette and DrPippy are trying to do. We don't need to be limited to one approach, or form of words. Chjoaygame, I think if you want to appeal to textbooks for wording, the appropriate ones are high school or middle school textbooks.

I would say a more productive use of time would be to start editing the article per WP:BRD. One or more editors could write a new introduction. Then arguments could be about specific wording, or whether their approach should be abandoned. --ChetvornoTALK 18:41, 6 December 2020 (UTC)[reply]

It is one thing to snip with the scissors, another to choose a good cloth.Chjoaygame (talk) 18:54, 6 December 2020 (UTC)[reply]
Withdrawn proposals & ensuing discussion

Hybrid version of 1st paragraph[edit]

Thank you, Chetvorno. I'm uncomfortable with doing a BRD on the intro, as we still haven't settled on a solid direction for the first paragraph. I shall now attempt a hybrid version of my and DrPippy's proposed first paragraphs, taking into consideration points raised by other editors:

Entropy is a concept in physics, specifically the field of thermodynamics, that has entered popular usage and is also employed in information theory. In popular usage, entropy is often considered to be a measurement of disorder, or to refer to a lack of order or predictability, or to a gradual decline into disorder.[citation] This usage stems from the word's use in physics, where entropy is understood as a numerical quantity that describes the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement. The equivalence between entropy and disorder arises because states that we recognize as "disordered" almost always have higher entropy than "ordered" states. For example, there are relatively few ways to organize a deck of cards so that it is separated by suit, compared to the number of arrangements where the suits are mixed together. Similarly, there are relatively few ways that particles in a concentrated puff of smoke can be arranged, compared to the number of arrangements after the smoke has spread throughout a room. A shuffled deck and spread-out smoke both have higher entropy than their "well ordered" counterparts.

One will note: "Understood" rather than "defined." "Spread" rather than "evenly distributed." "That we recognize as 'disordered'" rather than "disordered." -Jordgette [talk] 20:38, 6 December 2020 (UTC)[reply]

  • Next talk-page BRD:
The word 'entropy' arose in physics, in the fields of thermodynamics and statistical mechanics. It has come into ordinary language and is also used in informatics and mathematics. In popular usage, entropy refers to disorder, unpredictability, or gradual decline into chaos. In physics, entropy is a numerical quantity that measures the qualitative nature of heat transfer, and the diversity of motion of the many constituent microscopic particles of a body of matter and radiation. Entropy is seen as disorder because such microscopic motion is practically unpredictable and appears chaotic. Entropic motion spreads matter and energy as diversely as possible. The quantity 'entropy' is used in some expressions of the second law of thermodynamics.
This version tells the beginner that entropy in physics concerns the motion of matter and energy rather than, for example, systems of gambling, avoiding the perhaps forbidding word 'system' used to mean 'body of matter and energy'. It prefers reference to motion, avoiding an unphysical suggestion of restriction to static arrangement such as is considered in some mathematical calculations in statistical mechanics. Entropy less 'describes' the different ways than it measures their extent.Chjoaygame (talk) 23:25, 6 December 2020 (UTC)[reply]
  • One brief objection: while I am fine with the popular meaning of entropy being mentioned in the initial paragraph, I don't think it should be the lead definition. The article should lead with a technical definition. --ChetvornoTALK 22:51, 6 December 2020 (UTC)[reply]
That is fine. Perhaps the second paragraph can bring in the second law, and the third paragraph can bring in irreversibility, and that's the end of the lead. The other wording proposal fails to satisfy the mission for various reasons previously discussed, and heads us away from the growing consensus here.
Pending further comments, I will take a shot at revising this paragraph and drafting a second and third paragraph. -Jordgette [talk] 23:40, 6 December 2020 (UTC)[reply]
It is not a regular policy of Wikipedia that an article gets to have a specifically defined "mission".Chjoaygame (talk) 00:37, 7 December 2020 (UTC)[reply]

Proposed wording of lead, version 1[edit]

Entropy is a concept in physics, specifically the field of thermodynamics, that has entered popular usage and is also employed in information theory. Entropy is a numerical quantity that measures the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement. In popular usage, entropy is often considered to be a measurement of disorder, or to refer to a lack of order or predictability, or to a gradual decline into disorder.[citation] The equivalence between entropy and disorder arises because states that we recognize as "disordered" almost always have higher entropy than "ordered" states. For example, there are relatively few ways to organize a deck of cards so that it is separated by suit, compared to the number of arrangements where the suits are mixed together. Similarly, there are relatively few ways that particles in a concentrated puff of smoke can be arranged, compared to the number of arrangements after the smoke has spread throughout a room. A shuffled deck and spread-out smoke both have higher entropy than their well-ordered counterparts.

Consider the animation of blue and red balls at right [in progress], which can be viewed as a schematic representation of molecules of a gas at two different temperatures, or molecules of two different gases. The balls start out separated, an arrangement we might describe as an "ordered state", with a particular value for the system's entropy. If each ball is allowed to move at random and displace other balls (as happens among gas molecules possessing heat energy), the balls do not stay separated for long. They spontaneously begin to blend, as balls of each color spread among the balls of the other color. As they do so, the entropy of the system rises: the number of arrangements of individual balls that would produce any given overall arrangement goes up considerably. At some point, however, the spread of each color into the other reaches a maximum, and we can no longer discern a "red side" or a "blue side"; any further movement does not appreciably change the situation. In thermodynamics, this point — at which entropy reaches a maximum — is called equilibrium.

The second law of thermodynamics is one of the foundational principles of physics; it states that the entropy of a closed system (i.e., one with no outside influences) tends to increase over time until equilibrium is reached. For example, it is extremely improbable for the entropy of the randomly moving colored balls to decrease, i.e., for the balls to spontaneously regroup back into a "red side" and a "blue side." Likewise, it is extremely improbable for particles of smoke spread throughout a room to reform into a concentrated puff, or for a shuffled deck of cards, upon further shuffling, to spontaneously become reordered by suit.

The second law implies that many physical processes are irreversible. You can pour cream into coffee and mix it, but you cannot "unmix" it; you can burn a piece of wood, but you can't "unburn" it. If you saw a movie of smoke going back into a smokestack or mixed coffee separating into black coffee and cream, you would know that the movie had been reversed. In some cases, however, the entropy of a changing system increases very little. When two billiard balls collide, the change in entropy is very small (a bit of their kinetic energy is lost to the environment as heat), so a reversed movie of their collision might appear normal.

The question of why entropy increases, until equilibrium is reached, was answered in 1854 by Ludwig Boltzmann. The theory developed by Boltzmann and others, known as statistical mechanics, explains thermodynamics in terms of the statistical behavior of the atoms and molecules that make up the system. Later, Claude Shannon applied the concept of entropy to information, such that the entropy of a message transmitted along a wire can be calculated.

Proposed sections of body:

• History of entropy in thermodynamics

• Entropy in information theory

• Entropy in popular culture

-Jordgette [talk] 17:40, 7 December 2020 (UTC)[reply]
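A quick arithmetic check on the deck-of-cards sentence in the first paragraph of this proposal (background, not suggested wording): a 52-card deck has <math>52! \approx 8 \times 10^{67}</math> possible orderings, while the orderings that keep the four suits in separate blocks number only <math>4!\,(13!)^4 \approx 4 \times 10^{40}</math>, which is fewer than one ordering in <math>10^{27}</math>; so "relatively few" is a dramatic understatement.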

Some suggestions. These are just my opinions:
  • "...the entropy of a closed system tends to increase over time". This should be "isolated system". In thermodynamics, a "closed system" is one which can still exchange heat with the outside.
  • "...which can be viewed as a schematic representation of molecules of a gas at two different temperatures..." The red and blue balls animation doesn't apply well to gases at different temperatures, since diffusing molecules at different temperatures don't stay at different temperatures but come to the same temperature.
  • "If each ball is allowed to move at random and displace other balls..." My feeling is one of the important concepts to get across is that what drives entropy increase is that matter at an atomic scale is in constant motion, gas particles are constantly diffusing even in a gas at rest. I'd replace this with something like "At any temperature above absolute zero, the gas particles are constantly in random motion, and so as time passes they get mixed up together..."
  • "...they begin to blend...", "...the spread of each color into the other..." I'd use a word like "mix", "blend" and "spread" might imply to some that the balls combine.
  • "...this point...is called equilibrium.". "Thermodynamic equilibrium"; there are other kinds.
--ChetvornoTALK 20:00, 7 December 2020 (UTC)[reply]
I'm generally on board with this version, and with Chetvorno's suggestions. I wonder if it would be worthwhile tying some (or all) of the examples back to the stat-mech concept of entropy; e.g. something to the effect that there are far more arrangements of red and blue balls intermixed than with each color on its own side, etc. Similarly, more ways to disperse the cream molecules throughout the cup than to have them all in the same region. I think this would help to tie subsequent paragraphs back to the concepts introduced in the first paragraph. Regardless, I'd be happy moving the current suggested version to the main page. It'll need some tweaking, I'm sure, but I think it's pretty close to where it needs to be. DrPippy (talk) 19:20, 8 December 2020 (UTC)[reply]
Billiard balls move not at random, but according to Newton's laws of motion. If we should think of them as modeling gas molecules, we should think of gas molecules as moving not at random, but according to Newton's laws of motion. 'Random motion' is a mathematical or informatical artifice, an intellectual schema, not a physical fact. The macroscopic (thermodynamic) and the microscopic (statistical mechanical) accounts differ in their amounts of detail. The thermodynamic account ignores motion. The statistical mechanical account admits it, but ignores most of its detail. A physical account attends closely to its detail.Chjoaygame (talk) 21:51, 7 December 2020 (UTC)[reply]
Technically, the motion of billiard balls is derived from the laws of QED, themselves an approximation to some more fundamental theory. Both Newton's laws of motion and QED are equally unhelpful in dealing with the behavior of the individual particles involved in macroscopic systems. These considerations will be a distraction in the context of this article. DrPippy (talk) 19:25, 8 December 2020 (UTC)[reply]
Quoting: "Technically, ..." An argumentum ad verecundiam, and itself a distraction.
The laws may not help some mathematicians. But they are the laws of physics, and would help some other mathematicians. Yes, they would distract from the mathematically oriented point of view and doctrine that is being advocated by Editor DrPippy, and by those who say 'no mathematics in this article'.Chjoaygame (talk) 20:09, 8 December 2020 (UTC)[reply]
Boltzmann was aged 10 in 1854.Chjoaygame (talk) 21:57, 7 December 2020 (UTC)[reply]
Quoting: "Consider the animation of blue and red balls at right." Wikipedia policy deprecates giving such instructions to readers. Yes, one finds them in textbooks, but Wikipedia is not a textbook. Perhaps a Wikipedia compliant version might read 'The animation of blue and red balls at right [in progress] models the molecules of two different gases.'Chjoaygame (talk) 22:05, 7 December 2020 (UTC)[reply]
Agreed. DrPippy (talk) 19:25, 8 December 2020 (UTC)[reply]
Yes, 'spread' can suggest continuity, and that can be unfitting. 'Dispersal' conveys the idea of discreteness.Chjoaygame (talk) 22:13, 7 December 2020 (UTC)[reply]
Quoting: "Entropy is a numerical quantity that measures the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement."
For a dynamical system such as a body composed of rapidly moving molecules, it is not too clear what is meant by an "arrangement". Nor is it too clear what is meant by the "same overall arrangement" of such molecules. Yes, talk of 'arrangement' is sometimes useful to simplify mathematical derivations of formulas for statistical entropy.
But this article is not primarily about mathematics. For such a dynamical system, physical thinking is about trajectories, orbits, rotations, and collisions. For a body composed of molecules orbiting or even nearly immobilized in each others' influences, rotating, moving in more or less free trajectories, and colliding, statistical entropy measures the extent or dispersal of diversity of motion. Mathematical calculation of that is not easy, and, for most bodies of matter, is just not feasible. But the physical idea of extent or dispersal of diversity of motion is not too difficult.
It is a remarkable and useful fact that, for many purposes, one can ignore such dispersal of diversity of motion, and simply distinguish two forms of transfer of energy into or out of such a body, as thermodynamic work, and as heat. Thermodynamic entropy is measured in a subtle way, by transferring heat into or out of the body slowly, in little increments. For the entropy measurement, the quality of each little heat increment is registered quantitatively. The registering quantity is the temperature of the body at the time of the incremental transfer. From these data, the thermodynamic entropy is calculated.
Some bodies of matter and energy have such exceptionally simple microscopic compositions that it is feasible to actually perform the mathematical calculation of their statistical mechanical entropies. (One outstanding simplifying characteristic is that the body consists just of freely moving molecules, hardly interacting with one another except when they collide.) For such simply composed bodies, the calculated statistical mechanical entropies very closely match the respective measured thermodynamic entropies. In some cases, the match is so close that it provides an accurate calibration of thermometers.Chjoaygame (talk) 23:39, 7 December 2020 (UTC)[reply]
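For concreteness about the measurement procedure described in the comment just above (again background, not article text): each little reversible heat increment is divided by the temperature at which it is transferred and the quotients are summed, so that, for gentle heating at constant pressure with a constant heat capacity <math>C_p</math>,
<math>\Delta S = \int_{T_1}^{T_2} \frac{C_p\,\mathrm{d}T}{T} = C_p \ln\frac{T_2}{T_1}.</math>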
When billiard balls collide inelastically, the lost kinetic energy mostly enters the balls themselves as heat, and partly departs as sound. The spread as heat into the surroundings is slower. In the context of thermodynamics, it is traditional, and, I think, desirable, to speak not of 'the environment', but, rather, of 'the surroundings'.Chjoaygame (talk) 23:49, 7 December 2020 (UTC)[reply]
Quoting: "it states that the entropy of a closed system (i.e., one with no outside influences) tends to increase over time until equilibrium is reached."
Quoting: "At some point, however, the spread of each color into the other reaches a maximum, and we can no longer discern a "red side" or a "blue side"; any further movement does not appreciably change the situation. In thermodynamics, this point — at which entropy reaches a maximum — is called equilibrium."
In thermodynamics, entropy registers a discrete change from one state of thermodynamic equilibrium to another. It does not grow gradually over time to reach a maximum. Talk of gradual increase of entropy over time is slick and seductive, and common enough, but misleading. I think it better to avoid starting someone off with a misleading idea. A state of thermodynamic equilibrium lasts practically for ever. The microscopic instantaneous state traverses a trajectory, exploring the whole of the accessible phase space. No particular point in the trajectory is an 'equilibrium' point. It is the trajectory as a whole that constitutes the thermodynamic equilibrium. Entropy is a property of the trajectory in phase space.Chjoaygame (talk) 00:13, 8 December 2020 (UTC)[reply]
Quoting: "it is extremely improbable for the entropy of the randomly moving colored balls to decrease".
Yes, people do use concepts of randomness and probability as a mathematical artifice for the calculation of entropy. Yes, the statistical entropy of a system of moving coloured balls can be mathematically calculated, or at least approximated, by ignoring some details of their motion and treating their motion as random, and even ignoring their motion, treating them as if they are somehow just 'arranged'. But the entropy of a state of thermodynamic equilibrium, like the state itself, is unchanging in time; that is why thermodynamic entropy is a useful measurement. If the balls form a system that models thermodynamic equilibrium, it is not just "extremely improbable for the entropy ... to decrease"; it is self-contradictory.Chjoaygame (talk) 11:51, 8 December 2020 (UTC)[reply]

Would another editor kindly parse the above and opine on which elements, if any, are relevant to this article and its intended audience? Clearly the Boltzmann date is wrong (should be 1872). Much of this text is copied from earlier proposals. Thank you. -Jordgette [talk] 15:45, 8 December 2020 (UTC)[reply]

At the risk of being overly blunt: I started reading through Chjoaygame's comments above and trying to respond, but I'm finding them very difficult to decipher. I have a few suggestions that would make it easier to incorporate Chjoaygame's views into the article:
  • It seems as though you disagree with (or at the very least, have some reservations about) the apparent consensus to move this article towards providing a layman's introduction to the concept of entropy. If so, could you provide a concise explanation of your points of disagreement? This would make it a lot easier to engage with your objections.
  • From my vantage point, it seems as though many (possibly most) of the points you raise lack an obvious relevance to the topic at hand, and also involve a fairly idiosyncratic understanding of the physics involved. (E.g., bringing up Newtonian mechanics in the context of thermal motion in macroscopic systems is pretty out there...) It can be illuminating to consider scientific ideas from an unusual vantage point (given appropriate sources, etc.). But it's important to be very clear about how it relates to more traditional approaches. (I'd also question if this particular article is the right venue for that sort of thing.)
  • A bit more brevity would make it easier for me to grasp the most salient points you're trying to make.
I'm assuming your edits are well-intentioned and made in good faith, but at the moment I'm finding it hard to see how to even address your concerns, much less incorporate them in usable main-page text. DrPippy (talk) 20:43, 8 December 2020 (UTC)[reply]
Thank you for your kind invitation. I have said what I think I ought to say. I doubt that my saying more would help. I guess it is now time for me to retire from this talk.Chjoaygame (talk) 21:20, 8 December 2020 (UTC)[reply]

Hi folks, sorry for the delay. Give me another couple of days to finish up the animation and put a version together incorporating the feedback. I'll plug it into the article for further feedback and refining, and then we'll move on to the body. -Jordgette [talk] 03:45, 11 December 2020 (UTC)[reply]

As per previous discussions, a distinction MUST be made between thermodynamic entropy (S, a state variable of a thermodynamic system), information entropy (Log[W], a measure of the "spread" of some probability density function), and the statistical mechanics EXPLANATION of thermodynamic entropy (S=k Log[W]), which links the two by assuming that matter is composed of particles, and that every microstate available to a system in a given macrostate is equally probable. The statement that "Entropy is a numerical quantity that measures the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement" is not good. The "Entropy" referred to here is the information entropy as applied to a system of particles assuming each available microstate is equally probable. It is not thermodynamic entropy, but is related to thermodynamic entropy via Boltzmann's statistical mechanical theory (S=k Log[W]). Thermodynamic entropy is defined by the second law, and the laws of thermodynamics have NOTHING to do with any statistical mechanical concepts and should not be interpreted as such. PAR (talk) 18:19, 11 December 2020 (UTC)[reply]
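For orientation, the three quantities distinguished in the preceding comment can be set side by side (shorthand added here for readability; the notation follows the comment, not any cited source):
dS = \delta Q_{\mathrm{rev}}/T   (thermodynamic entropy, a state variable defined by measurement)
H = \log W   (information entropy, for W equally probable outcomes)
S = k \log W   (Boltzmann's statistical-mechanical link between the two)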
Quoting: "assuming each available microstate is equally probable." The physical problem is to so formulate or specify the microstates that every available one has one and the same probability.Chjoaygame (talk) 10:12, 13 December 2020 (UTC)[reply]
This distinction is not rocket science, it's not some esoteric knowledge understandable only to those who have studied these subjects for years. A twelve year old can understand the difference between learning to drive a car (analogous to thermodynamics), and understanding the inner workings of a car which explains why it drives the way it does (analogous to statistical mechanics). PAR (talk) 18:19, 11 December 2020 (UTC)[reply]
I agree with PAR that the thermodynamic definition of entropy in terms of state variables should be mentioned in the introduction. Maybe something like: "In thermodynamics entropy is defined as the fraction of internal energy that is unavailable for doing work divided by the temperature". I don't agree with PAR that the thermodynamic concept of entropy has nothing to do with the statistical mechanics concept, but explaining the relationship between the two is probably too involved for the introduction and should be kept in the article body. --ChetvornoTALK 18:44, 11 December 2020 (UTC)[reply]
I didn't mean to imply that the thermodynamic concept of entropy has nothing to do with the statistical mechanics concept. They are most definitely related, by Boltzmann's S=k Log[W] equation. My point was that thermodynamic entropy does not refer to, nor does it need to refer to, any statistical mechanical concept, in order to be completely defined. Boltzmann's equation is not an identity, it is a theory, one that is exceedingly successful in explaining thermodynamic entropy, but it provides two equivalent definitions of thermodynamic entropy in theory only, not in fact. The first and second laws of thermodynamics are a collection of facts organized by the principles of conservation of energy and non-decreasing thermodynamic entropy. They make no reference to any statistical mechanical concept. Statistical mechanics is an extremely successful theory which explains these facts. If any statistical mechanical theory is ever at odds with thermodynamics, then it is simply wrong, and not vice versa. PAR (talk) 19:23, 11 December 2020 (UTC)[reply]
Then it looks like we're back to square one on the intro. I was hoping that we could find a way to introduce entropy to people who have no idea what thermodynamics, or internal energy, or work are — but apparently that's simply impossible. I've done what I can, and I'm really not interested in litigating this any further. I'll leave the animation here for review upon its completion, and interested editors can use it or not use it in the article as they please. -Jordgette [talk] 19:26, 11 December 2020 (UTC)[reply]
I don't think people need to understand thermodynamics, internal energy, or work in order to understand irreversibility, and that's what is in the introduction. I feel like I am banging my head against a wall, it's so simple. Thermodynamic entropy is about irreversibility, and you can explain irreversibility to a twelve year old by offering examples. Popular attempts to describe entropy, like disorder, spreading, and lost information, are all useful but ultimately do not hit the mark. We can use them to give a feel for entropy, but to declare them to be the final word is to lie.
I'm the one who objects to lying in the name of simplification when we don't need to. You are the one who has held people's feet to the fire demanding simplicity, and rightly so. You are the one who made me understand that these popular (mis)conceptions of entropy must be addressed and addressed quickly in the introduction, in order to let people who have these (mis)conceptions know they are on the right page. In what way are we not threading that needle? I've tried, and I wish you would not give up on this. I know, this discussion gets tiring after a while, listening to me and Chjoaygame and Chetvorno and Dr. Pippy and BDuke sounding like we argue about how many angels can dance on the head of a pin, but the bottom line is that we are trying to thread that needle. Please, just understand what the introduction is saying, and point out your objections and explain the flaws. People learn entropy as a mushed up mess of thermodynamics, statistical mechanics, and flawed popular takes on entropy, and then simply won't let go. I will let go if I am wrong, but the only way that happens is a reasoned discussion/argument. PAR (talk) 21:01, 11 December 2020 (UTC)[reply]
The reason the discussion gets "tiring" is that you are talking in generalities rather than proposing concrete improvements to the article. What specific sentences do you object to in Jordgette's version above and what would you change them to? --ChetvornoTALK 02:09, 12 December 2020 (UTC)[reply]
  • The first paragraph: "Entropy is a concept in physics, specifically the field of thermodynamics, that has entered popular usage and is also employed in information theory. Entropy is a numerical quantity that measures the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement." This is so wrong in a number of ways. There is thermodynamic entropy and there is information entropy. They are two different things, yet the distinction is destroyed in this sentence by referring to "entropy" as if they were the same thing. Thermodynamic entropy is "a concept in physics, specifically the field of thermodynamics". Thermodynamic entropy is defined by a measurement of thermodynamic variables on a real system. It is not "employed" in information theory. Information entropy is a measure of the "spread" of a hypothetical probability distribution. Information entropy, combined with the atomistic assumption and the assumption of equal probability of microstates, yields the statistical mechanical entropy - "a numerical quantity that measures the number of different ways that the constituent parts of a system can be arranged to get the same overall arrangement." If we assume that matter is composed of particles, we can define a microstate and a macrostate and if the probability of every microstate is assumed to be equal (=1/W), then we can define a statistical mechanical entropy k Log[W] and propose that it is equal to the thermodynamic entropy. These need not be esoteric concepts. Thermodynamic entropy reveals itself to our senses qualitatively and quantitatively as irreversibility. Easy to understand, given a few examples. The more difficult part is the statistical mechanical explanation of thermodynamic entropy. We have to introduce microstates and macrostates in as simple a way as possible, and show how the microstate is constantly changing, and propose that each microstate is equally probable. Now we can show by simple examples how the macrostate is almost always the equilibrium state, which presents itself to us as an unvarying macrostate. This will hopefully demonstrate the core conclusions of statistical mechanics.
  • The second paragraph: A further description of information entropy applied to statistical mechanics. As long as it doesn't imply that this is a representation of thermodynamic entropy, or that entropy is completely explained by the spreading process, that's fine.
  • the third paragraph - a conceptual link between thermodynamic entropy and stat mech entropy is introduced. That's fine, but it's out of order.
  • the fourth paragraph - Finally, thermodynamic entropy is introduced. This should come first.
  • The fifth paragraph introduces statistical mechanics - a subject covered by most of the above paragraphs. Statistical mechanics should be introduced after thermodynamics, and then described. Then there is the statement: "Later, Claude Shannon applied the concept of entropy to information, such that the entropy of a message transmitted along a wire can be calculated." This statement is totally false. Shannon, working on information theory, developed a measure of (missing) information, which he called "information entropy". It is a measure of the "spread" of a hypothetical probability distribution function. He applied that to the practical problem of message transmission. In retrospect, this same information entropy was applied by Boltzmann in his development of statistical mechanics. It is just the "Log(W)" in his famous equation.
In contrast, I believe the present introduction avoids these problems and roughly attempts to present things in a simple, logical order. Please list your objections to the present introduction. PAR (talk) 06:22, 12 December 2020 (UTC)[reply]

I'm fundamentally at odds with PAR's position on this. The laws of thermodynamics emerge from statistical mechanics, not the other way around. The reason that entropy increases is fundamentally due to the statistical properties of large systems of particles; if thermodynamics didn't have the Second Law, statistical mechanics would demand it. If I'm understanding PAR's point about the relationship between thermo and stat mech (e.g., "If any statistical mechanical theory is ever at odds with thermodynamics, then it is simply wrong, and not vice versa."), it simply amounts to an argument that theories have to conform to observations. This is not in dispute, but doesn't really provide any useful guidance for why the thermodynamic view of entropy should take precedence over the statistical mechanical view. The fact of the matter is that stat mech entropy explains thermo entropy; the reverse is not true. Nor does thermo entropy provide a particularly clear explanation of why the second law implies irreversibility, tendency to disorder, etc.; the stat mech version does.

Thermodynamics says that there's some quantity defined by dS=dQ/T, and that dS>=0 for any spontaneous process in an isolated system, etc. This is sort of like observing that planetary orbits are elliptical. This observation is explained by Newton's law of gravity, which also explains a number of other phenomena. Arguing that we should start introducing the concept of entropy with the narrower thermodynamic definition strikes me as similar to arguing that we should start an introduction to gravity by talking about the shape of planetary orbits rather than with Newton's law.

I also think that the stat mech version is potentially easier to understand. I'm not sure that there's a way to frame entropy as a measurement of heat that's unavailable to do work (or whatever version of this you prefer) that isn't going to feel pretty abstract to the uninitiated; at least, I haven't heard one that really works for me. If we don't mind sacrificing some degree of technical rigor (and I think that's okay in this context), we could simply say that entropy is a measurement of the probability that a system will find itself in some particular state/arrangement/configuration, and thus the second law is simply a result of a system in a relatively improbable state moving to increasingly probable states as the deck is slowly shuffled, so to speak. In fact, I think I would prefer something along these lines even to the "number of arrangements" wording that Jordgette and I have been working with. (This definition is technically wrong, but it's wrong in sort of the same way that Newtonian gravity is wrong compared to GR, and you definitely wouldn't want to start off a layman's introduction to gravity by talking about geodesics in curved spacetime and all that.)
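An illustrative toy simulation of the "slowly shuffled deck" picture sketched above (an editorial aside with arbitrary numbers, not anything proposed by the editors for article text): N particles sit in a box with a left half and a right half; at each step one randomly chosen particle hops to the other half. Started from the very improbable macrostate "all particles on the left", the system drifts toward the most probable macrostate (about half on each side) and then stays near it, which is the statistical reading of the second law.

import math
import random

N = 100                      # number of particles (arbitrary)
steps = 2000
n_left = N                   # start in an improbable macrostate: all on the left

def log_multiplicity(n_left, N):
    # log W for the macrostate "n_left particles on the left", via the log of the binomial coefficient
    return math.lgamma(N + 1) - math.lgamma(n_left + 1) - math.lgamma(N - n_left + 1)

for step in range(steps):
    # pick one particle at random; if it is on the left it hops right, and vice versa
    if random.randrange(N) < n_left:
        n_left -= 1
    else:
        n_left += 1
    if step % 400 == 0:
        print(f"step {step:5d}: n_left = {n_left:3d}, log W = {log_multiplicity(n_left, N):6.2f}")

The printed log W (Boltzmann's Log[W], up to the constant k) climbs toward its maximum near n_left = 50 and then fluctuates only slightly about it.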

TL;DR version: stat mech provides an explanation of the concept of (physical) entropy which is more explanatory, more intuitive, and more fundamental than thermodynamics does, so we should lead with that. I would prefer to explain the entropy in terms of probability instead of multiplicity, but I feel less strongly about that.

We seem to be at a bit of an impasse: I think Jordgette and I are more or less on the same page, and possibly Bduke as well; PAR and Chjoaygame are not on that page, but their objections seem to come from different directions. Is it time to pursue some sort of dispute resolution process (RfC, etc.)? DrPippy (talk) 14:26, 12 December 2020 (UTC)[reply]

Excellent - Thank you for a clear explanation of your point of view that recognizes the difference and the relationships between thermodynamic entropy and statistical mechanical entropy. I like the analogy with planetary motion. I think (maybe) this list a set of things we can agree on:
  • Thermodynamics is a phenomenological theory, a condensation of experimental, macroscopic facts, which are constrained by the phenomenological laws of thermodynamics. It defines thermodynamic entropy. This situation is similar to the theory of planetary motion at the time of Kepler - a series of measurements which showed that planetary orbits were elliptical, and constrained by Kepler's three phenomenological laws of planetary motion.
  • Statistical mechanics is a theory which, using information theory and the idea that matter is particulate, and that microstates corresponding to a particular macrostate all have equal probability, and that energy and particle number are conserved, is able to explain and clarify thermodynamic entropy, and the laws of thermodynamics. Statistical mechanics defines a statistical mechanical entropy (k Log(W)) and equates it to the thermodynamic entropy (S). This situation is similar to Newton's law of gravitation, which explains and clarifies Kepler's laws.
  • Information theory is a mathematical abstraction which can be applied to many different situations involving probability distributions. Information entropy is also a mathematical abstraction. It is somewhat similar to the integral and differential calculus that Newton invented to deal with his theory of gravitation, but which can be applied to a much broader set of problems.
If you find these statements agreeable, then the thing we disagree on is which theory, thermodynamics or statistical mechanics, comes a priori and which comes a posteriori, both conceptually and from a pedagogical (teaching) "introduction to entropy" point of view. Good, that we can argue about. My point of view is that thermodynamic entropy comes first (a priori), then statistical mechanics (a posteriori), historically, conceptually, and pedagogically. You disagree on the last two, and I can't "prove" you to be wrong. Have I described the situation accurately? If so, this is a lot of progress, and our point of contention is clarified.
PS - I found a very interesting web page that I keep referring back to: [3]. Although it roughly supports my present point of view, your above explanation causes me to question it, so I take this article as informative rather than definitive. See especially Einstein's quote on constructive theories versus theories of principle. PAR (talk) 00:16, 13 December 2020 (UTC)[reply]
Quoting from that article: "It is well accepted that protein and lipid self-assembly is a direct consequence of the second law of thermodynamics." It doesn't violate the second law, but I think it presumes a lot to say "direct consequence".
Here, Editor PAR is referring to Einstein's inversion of Boltzmann's formula. Einstein also allowed fluctuations of 'entropy' at times. We may also remember that Einstein thought that classical thermodynamics, within its scope of applicability, will never be overthrown.Chjoaygame (talk) 01:39, 13 December 2020 (UTC)[reply]
That is an interesting article! I'll need to go back and look at it more closely. I'd nitpick a little bit regarding the extent to which statistical mechanics borrows from information theory (which I unfortunately don't know much about). As I understand it, though, information theory has some version of "entropy" which has a similar mathematical form to stat-mech entropy, but there's no version of the second law for info theory. Physical entropy, though, is important (as I understand it) precisely because of the second law, which explains a lot of things ranging from the direction of heat flow to the ultimate fate of the universe. So I'm a little leery of trying to draw interpretive connections between physical and informational entropy, because it's not immediately obvious what those connections should be, or even if they exist beyond the fact that they look mathematically similar.
With all that said, I think there's probably a way to start with the thermodynamic version of entropy and then move from there to the stat-mech version, and certainly that's what happened historically. I'm very happy to wait and see what y'all come up with and then offer critiques and suggestions; this is probably fairer and more constructive than me focusing on potential conceptual problems with the approach. Regardless of how it turns out, it's been a fun and interesting conversation! DrPippy (talk) 15:29, 17 December 2020 (UTC)[reply]
Good to have your comment. Glad to hear you've found it fun.
Agreed that statistical mechanics doesn't borrow from information theory. As I see it, information theory provides a fresh way of reading or interpreting or expressing the ideas of statistical mechanics. As you say, this is largely because they use the same mathematical formulas. The link is in combinatorics.
Disagree that the second law explains things. For me, it just describes them, using language derived from the distinction between heat and work. Strongly reject the widely made proposal that the second law has much useful to say about the fate of the universe.Chjoaygame (talk) 16:05, 17 December 2020 (UTC)[reply]
@Dr. Pippy - Again, thank you for putting a fine point on the nature of our disagreements. Because of that, and the article I mentioned, I have been practicing the "statistical mechanics is a priori" approach. I still ultimately don't like it, but it took some blinders off. Also, statistical mechanics most certainly does "borrow" from information theory:
A quick take on info entropy (H) in statistical mechanics: In information theory, for the general discrete case:
H = -\sum_{i=1}^{W} p_i \log p_i
where p_i is the probability of the i-th outcome, whatever you choose that to be: dice roll, message, microstate, whatever. There are a total of W outcomes, so that:
\sum_{i=1}^{W} p_i = 1
As applied in statistical mechanics, each microstate (outcome) is assumed to be equally probable, so that, by the second equation, p_i = 1/W. Plug that into the first equation and you get the information entropy H = Log(W), which is the same Log(W) in Boltzmann's equation S = k Log(W). PAR (talk) 16:39, 17 December 2020 (UTC)[reply]
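A short numerical check of that reduction (illustrative only; the die example is arbitrary):

import math

W = 6                                    # e.g. a fair six-sided die
p = [1.0 / W] * W                        # equal probabilities, summing to 1
H = -sum(pi * math.log(pi) for pi in p)  # information entropy, in nats
print(H, math.log(W))                    # the two numbers agree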
Boltzmann's equation is such a beautiful thing - it unites the information entropy, Log(W), Boltzmann's constant k, and the theoretical statistical mechanical entropy k Log(W) which it equates to the thermodynamic entropy S. PAR (talk) 17:13, 17 December 2020 (UTC)[reply]
Since we are going into detail, according to Cercignani,[1] Boltzmann knew and published the formula,
-\sum_i p_i \log p_i ,
though that fact is not widely celebrated. (Very tied up right now, please give me till tomorrow to find page number.)Chjoaygame (talk) 19:37, 17 December 2020 (UTC)[reply]
  1. ^ Cercignani, C. (1998). Ludwig Boltzmann: the Man who Trusted Atoms, Oxford University Press, Oxford UK, ISBN 9780198501541.
Quoting from Cercignani, p. 8, quoting Boltzmann: "... erroneous to believe that the mechanical theory of heat is therefore afflicted with some uncertainty because the principles of probability theory are used. ... It is only doubly imperative to handle the conclusions with the greatest strictness."
Quoting from Cercignani, p. 8, commenting on Boltzmann: "But he also seemed to think that he had obtained a result which, except for these fluctuations, followed from the equations of mechanics without exception."
I don't recall us mentioning the origin of the 'disorder' doctrine. Perhaps this quote from Cercignani, p. 18, may help, though I think it isn't enough to settle the matter: "In 1877 he published his paper “Probabilistic foundations of heat theory”, in which he formulated what Einstein later called the Boltzmann principle; the interpretation of the concept of entropy as a mathematically well-defined measure of what one can call the "disorder" of atoms, which had already appeared in his work of 1872, is here extended and becomes a general statement."
Coming to the present point. On page 18, Cercignani writes "In the same year [1877] he also wrote a fundamental paper, generally unknown to the majority of physicists, who by reading only second-hand reports are led to the erroneous belief that Boltzmann dealt only with ideal gases; this paper clearly indicates that he considered mutually interacting molecules as well, with non-negligible potential energy, and thus, as we shall see in Chapter 7, it is he and not Josiah Willard Gibbs (1839-1903) who should be considered as the founder of equilibrium statistical mechanics and of the method of ensembles."
On page 55, Cercignani quotes Boltzmann: "The assumption that the gas-molecules are aggregates of material points, in the sense of Boscovich, does not agree with facts." Boltzmann knew, from spectroscopy, that atoms must be intricately complex objects.
On page 64, Cercignani writes: "Thermodynamics ... can be regarded as a limitation of our ability to act on the mechanics of the minutest particles of a body ..." I think this is a wise statement. My reason is that it does not appeal to probability. Maxwell's demon has the ability that we lack.
On page 82, Cercignani writes about Carnot: "Essentially he saw that there was something that was conserved in reversible processes; this was not heat or caloric, however, but what was later called entropy." This might help us talk about the relation between (ir)reversibility and entropy, a matter that Editor Chetvorno has raised. The characteristic of entropy is that it increases as a result of a thermodynamic process. The pure mode of entropy generation is in heat transfer. In contrast, ideally pure work transfer generates no entropy. (At the risk of being accused of some crime, I may say that work is like heat transfer from a body that is at infinite temperature: \Delta S = Q/T \to 0 as T \to \infty; such transfer can heat any body in the surroundings. To extract all the internal energy of a body as work, we need a heat reservoir at zero temperature.)
On page 83, Cercignani writes: "The first attempts at explaining the Second Law on the grounds of kinetic theory are due to Rankine [22, 23]." Rankine (I forget the exact date, but about 1849 or 1850) used a quantity that he called "the thermodynamic function", later called 'entropy' by Clausius.
On pages 83–84, Cercignani writes: "Boltzmann himself makes his first appearance in the field with a paper [25] [1866] in which he tries to prove the Second Law starting from purely mechanical theorems, under the rather restrictive assumption that the molecular motions are periodic, with period τ, and the awkward remark, which might perhaps be justified, that “if the orbits do not close after a finite time, one may think that they do in an infinite one”. Essentially, Boltzmann remarks that temperature may be thought of as the time average of kinetic energy, while heat can be equated to the average increase in kinetic energy; if we compute the unspecified period from one of the relations and substitute the result into the other, it turns out that the heat divided by the temperature is an exact differential. This part of the paper appears to be a rather primitive justification of the first part of the Second Law; as for the second part, Boltzmann's argument belongs more to pure thermodynamics than to statistical mechanics and leads to the conclusion that entropy must increase in an irreversible process." I guess that is the Poincaré recurrence time.
My automatic tldr alarm is ringing loudly and I will stop for now.Chjoaygame (talk) 16:04, 18 December 2020 (UTC)[reply]
Here is a comment from the point of view of Jaynes's 'mind projection fallacy' argument.
The particles can't read the observer's mind. So the physical question is not 'when and how much will the particles decide to move randomly so as to violate the laws of motion?' No, the physical question is 'at what stage of the Poincaré recurrence cycle will the observer choose to observe the body in its state of thermodynamic equilibrium?' That tells us the physical source of the 'subjectivity' of the probabilistic view of entropy.Chjoaygame (talk) 18:57, 18 December 2020 (UTC)[reply]
For the microscopic view of a body of matter and radiation in its own state of internal thermodynamic equilibrium, the physical entropy belongs to the trajectory of the constituent particle set through its phase space. Physical entropy measures how the trajectory in the chosen body extends in or partly fills phase space. A trajectory may be characterized by the Lie group of the law of motion that generates it, along with its Poincaré recurrence time. Mathematical entropy is labeled by the Lie group of the law of motion that generates the trajectory. It does not change in time. The observer subjectively or randomly chooses when to observe or sample the trajectory. Statistical mechanics is a mathematical algorithm to find probabilities for the various states which the trajectory will have reached at the respective various sampling times, as well as to calculate the time-invariant entropy of the trajectory.Chjoaygame (talk) 23:24, 18 December 2020 (UTC)[reply]
Our task here is to find good ways for the article to introduce ideas about entropy. On page 81, Cercignani offers a thought: "In any case there remained the important unsolved problem of deducing the Second Law of Thermodynamics. As is well known from elementary physics, this principle is often subdivided into two parts, according to whether we consider just reversible processes or irreversible processes as well." Carnot considered the most efficient imaginable heat engine, that conserved entropy, thought of by Carnot as the precious "caloric", not to be wasted, running on the quasi-equilibrium and infeasible Carnot cycle. Later, physicists considered other processes, feasible and producing entropy, wasting energy as heat.
Cercignani remarks on page 84 that Maxwell knew the reason for the second law: “"Hence, if the heat consists in the motion of its parts, the separate parts which move must be so small that we cannot in any way lay hold of them to stop them" [19]. In other words, the Second Law expresses a limitation on the possibility of acting on those tiny objects, atoms, with our usual macroscopic tools.” The reason for the second law is our incompetence and caprice, not nature's imputed gambling. Many of us would prefer to exonerate our incompetence and caprice, and instead convict nature of gambling.
On page 86, Cercignani points to the mathematical trick used in ergodic theory: "As remarked by M.J. Klein [6], Boltzmann interprets Maxwell's distribution function in two different ways, which he seems to consider as a priori equivalent: the first way is based on the fraction of a sufficiently long time interval, during which the velocity of a specific molecule has values within a certain volume element in velocity space, whereas the second way (quoted in a footnote to paper [3]) is based on the fraction of molecules which, at a given instant, have a velocity in the said volume element." I find the first way more physical, the second more statistical. Cercignani continues: "It seems clear that Boltzmann did not at that time feel any need to analyse the equivalence, implicitly assumed, between these two meanings, which are so different."
On page 89, Cercignani tells us that Boltzmann knew that his derivation for ideal gases did not extend to more general substances, and that his H-function is not in general a thermodynamic entropy: "Boltzmann however warned his readers against the illusion of an easy extension of his calculations to the case of more complicated interaction laws." The illusion remains seductive today.
On page 99, Cercignani quotes for us how Boltzmann dealt with the famous objection raised by his good friend Loschmidt: "One therefore cannot prove that, whatever may be the positions and velocities of the spheres at the beginning, the distribution must become uniform after a long time; rather one can only prove that infinitely many more initial states will lead to a uniform one after a definite length of time than to a non-uniform one." Boltzmann assumes that the observation time is prescribed and that the initial microscopic instantaneous state is randomly chosen. Alternatively, randomness can be created from a given initial microscopic instantaneous state by the physicist's capricious choice of when to make the final observation. In contrast, physical entropy measures extent of physical diversity of lawful motion of microscopic constituents.
Also on page 99, Cercignani tells us his ideas on Boltzmann's word "order": “We remark that the use of terms such as "ordered states" or "uniformity" may be confusing, since there are various levels of description at which one can consider "order".”
On page 146, Cercignani quotes Poincaré on the same topic: “Thus the notions of molar geordnet (ordered at a molar level) or molekular geordnet (ordered at a molecular level) do not seem, in my opinion, to have been defined with sufficient precision.”
Why would the novice not suffer the confusion felt by Cercignani, Poincaré, Jaynes, Grandy, and others?
On page 100, Cercignani writes: "It is only when we pass to a reduced description, based on the one-particle distribution function, that we lump many states into a single state and we can talk about highly probable (disordered) states; these are states into which, in the reduced description, an extremely large number of microscopic states are lumped together." It takes some intellectual effort to see the relevance of disorder for physics. Should we demand that from the novice?
On page 104, Cercignani quotes Grad (1961) on the problem of multiple different meanings that people find in the one word 'entropy': "On the other hand, much of the confusion in the subject is traceable to the ostensibly unifying belief (possibly theological in origin!) that there is only one entropy. Although the necessity of dealing with distinct entropies has become conventional in some areas, in others there is an extraordinary reluctance to do so." I think that Cercignani himself at times fails on this.
On page 123, Cercignani tells of Boltzmann's use of the above formula.Chjoaygame (talk) 06:50, 19 December 2020 (UTC)[reply]
I did not know that Boltzmann came up with Shannon's entropy before Shannon did. I'm not sure how the rest of the exposition relates to the previous statement.PAR (talk) 07:34, 19 December 2020 (UTC)[reply]
According to Cercignani, it's not too widely celebrated. It was new to me when I read Cercignani. We are familiar with Boltzmann's use of the formula, but we didn't know that he used it in considering the canonical ensemble under the name 'holode', before Gibbs.
The rest of the exposition is because I read Cercignani more closely to find the reference and on the way found lots of more or less relevant bits that might be handy for us.Chjoaygame (talk) 09:39, 19 December 2020 (UTC)[reply]

Problems with the "Heat and entropy" section[edit]

This section appears to be focused on justifying the interpretation of the entropy change in a phase change as a spatial "spreading" of energy. It then concludes with the statement that "The important overall principle is that energy of all types changes from being localized to becoming dispersed or spread out, if not hindered from doing so." and that "Entropy (or better, entropy change) is the quantitative measure of that kind of a spontaneous process: how much energy has been transferred/T or how widely it has become spread out at a specific temperature."

This is simply wrong. The entropy of mixing at equal pressure and temperature involves no spatial spreading of energy or energy/T. The energy density is constant. Furthermore, it is fair to say that energy "spreads out" among all degrees of freedom, and to say that energy only spreads spatially is to say that energy can only spread out in the spatial degrees of freedom, which contradicts the first statement. This belongs in the "Introductory descriptions of entropy" section. PAR (talk) 19:03, 11 December 2020 (UTC)[reply]
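For reference, the standard ideal-gas formula behind that example (a textbook result, added here for readability, not quoted from the section being criticized): when two different ideal gases at the same temperature and pressure are allowed to mix,
\Delta S_{\mathrm{mix}} = -nR\,(x_1 \ln x_1 + x_2 \ln x_2) > 0 ,
where n is the total amount of gas and x_1, x_2 are the mole fractions; the temperature, pressure, and energy density are unchanged, yet the entropy increases.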

I removed the "Heat and Entropy" section and, since it appeared to be written by a proponent of the "energy dispersal" idea, I expanded the "Energy dispersal" section with references, and including the energy dispersal quote. PAR (talk) 03:02, 14 December 2020 (UTC)[reply]

Critique of recent edits[edit]

  • I don't think the relabeling of the section from "Introductory descriptions of Entropy" to "Versions of Entropy" is a good idea. I think there should be two subsections - basically thermodynamic and statistical mechanical approximate takes on entropy. The Statistical mechanical would include what is now "Microstate dispersal" (with corrections, see below), disorder, and missing information. The "Microstate dispersal" section is in the realm of statistical mechanics. PAR (talk) 10:49, 14 December 2020 (UTC)[reply]
As you please.Chjoaygame (talk) 12:39, 14 December 2020 (UTC)[reply]
  • In the "Microstate dispersal" section, the statement "ordinary language interpretation of entropy as 'dispersal of microstates throughout their accessible range'." is wrong. Microstates do not disperse, and Guggenheims original statement "If instead of entropy one reads number of accessible states, or spread, the physical significance becomes clear." does not justify the rewording. I also think that a phrase that contains "... that characterizes the fluid, crystalline oscillatory, phononic, molecular, atomic, subatomic, and photonic structure and composition of bodies of matter and radiation." is not appropriate for an introductory article. PAR (talk) 10:49, 14 December 2020 (UTC)[reply]
I have tried to address this.Chjoaygame (talk) 18:25, 14 December 2020 (UTC)[reply]
  • In the introduction, I think removing the paragraph which introduces and explains equilibrium and reversibility and replacing it with a paragraph that uses those as-yet-undefined terms is wrong. I have re-inserted the lost paragraph before that. PAR (talk) 10:49, 14 December 2020 (UTC)[reply]
Defined thermodynamic equilibrium, with a view.Chjoaygame (talk) 18:43, 14 December 2020 (UTC)[reply]
As I read you, the re-inserted paragraph is
Entropy does not increase indefinitely. As time goes on, the entropy grows closer and closer to its maximum possible value.[1] For a system which is at its maximum entropy, the entropy becomes constant and the system is said to be in thermodynamic equilibrium. In some cases, the entropy of a process changes very little. For example, when two billiard balls collide, the changes in entropy are very small and so if a movie of the collision were run backwards, it would not appear to be impossible. Such cases are referred to as almost "reversible". Perfect reversibility is impossible, but it is a useful concept in theoretical thermodynamics.
footnote
  1. ^ Strictly speaking, thermodynamics only deals with systems in equilibrium. The idea that entropy is continuously "changing" is actually an approximation in which the change is considered to be a number of individual steps, each step being an equilibrium state derived from the previous one.
I don't like that paragraph because it would let the novice forget that a thermodynamic process is defined by the values of its initial and final state variables, and that, given them, it is allowed to take any path including those far from thermodynamic equilibrium.Chjoaygame (talk) 23:01, 14 December 2020 (UTC)[reply]
I agree - I will change it. PAR (talk) 05:43, 15 December 2020 (UTC)[reply]
Great.Chjoaygame (talk) 06:12, 15 December 2020 (UTC)[reply]
Thank you for that. To save possible re-edits, I will talk here.
As I read it now, the relevant paragraph reads
Entropy does not increase indefinitely. A body of matter and radiation eventually will reach an unchanging state, with no detectable flows, and is then said to be in a state of thermodynamic equilibrium. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, then their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water. Such processes are irreversible: An ice cube in a glass of warm water will not spontaneously form from a glass of cool water. Some processes in nature are almost reversible. For example, the orbiting of the planets around the sun may be thought of as practically reversible: A movie of the planets orbiting the sun which is run in reverse would not appear to be impossible.
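As a numerical aside on the ice-cube sentence in the paragraph just quoted (back-of-envelope figures chosen only for illustration, not proposed article text): a 10 g ice cube at 0 °C melting in 200 g of water at 30 °C.

import math

c = 4.18          # specific heat of liquid water, J/(g K)
L = 334.0         # latent heat of fusion of ice, J/g
m_ice, m_water = 10.0, 200.0       # grams
T_melt, T_warm = 273.15, 303.15    # kelvin (0 degC and 30 degC)

# final temperature from an energy balance (all the ice melts)
T_f = (m_water * c * T_warm + m_ice * c * T_melt - m_ice * L) / ((m_ice + m_water) * c)

dS_melt = m_ice * L / T_melt                       # ice melting at 0 degC (positive)
dS_meltwater = m_ice * c * math.log(T_f / T_melt)  # meltwater warming up (positive)
dS_warm = m_water * c * math.log(T_f / T_warm)     # warm water cooling down (negative)

print(f"final temperature: {T_f - 273.15:.1f} degC")
print(f"total entropy change: {dS_melt + dS_meltwater + dS_warm:+.2f} J/K")

The gain from the melting ice and the warming meltwater outweighs the loss from the cooling warm water, so the total entropy of the glass's contents comes out positive, in line with the irreversibility described in the quoted paragraph.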
The sentence "Entropy does not increase indefinitely" is problematic. It seems to be a consequence or residue from the previous paragraph, which is itself more problematic, or unacceptable. The problem is here: "which says that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time.[1]"
note
  1. ^ Theoretically, coffee can be "unmixed" and wood can be "unburned", but for this you would need a "machine" that would generate more entropy than was lost in the original process. This is why the second law only holds for isolated system which means they cannot be connected to some external "machine".
The technical term 'system' is not ideal for an introductory lead, because it is a technical term. '... an isolated system ... which is undergoing change' is not a thermodynamic system in its own state of internal thermodynamic equilibrium and so does not have a defined thermodynamic entropy. Thermodynamic entropy increases process by process. "coffee can be unmixed" calls for a bit of imagination. Part of the problem is that the mixing was not a spontaneous thermodynamic process; it was driven by "you", an animate agency. Once lit, wood burns by itself. Perhaps it may be convenient to omit the coffee and cream?
Instead of "Irreversibility is described by an important law of nature known as the second law of thermodynamics, which says that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time.[2]", I would suggest instead 'Irreversibility is described by a law of nature known as the second law of thermodynamics, which says that when bodies of matter and radiation interact intimately, their total entropy increases.'
If this is accepted, then the sentence "Entropy does not increase indefinitely" can well be omitted.
If this is accepted, then the second statement of the second law can be omitted.
Addressing some details:
"An isolated body of matter and radiation eventually will reach an unchanging state ..."
"Thermodynamic entropy has a definite value for such a body. and is at its maximum value" The words "and is at its maximum value" are problematic. They don't say with respect to which constraints is the maximum. The novice can scarcely do better than to guess. I propose to omit them.
"... a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water." A problem is that "a glass of warm water with an ice cube in it" is not a thermodynamic system in its own state of internal thermodynamic equilibrium and so has no defined entropy. A thermodynamic process needs a thermodynamic operation to start. I suggest 'For example, when an ice cube is put into a glass of warm water, and allowed to melt, leaving a glass of cool water, then the total entropy increases.'
"Such processes are irreversible: ..." I would write 'Such a process is irreversible: ...'
"... would not appear to be impossible." I would write '... would appear to be possible.'
Assembling this, here is a proposal.
The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or to a gradual decline into disorder.[1] Physical interpretations of entropy refer to spread or dispersal of energy and matter, and to extent and diversity of microscopic motion.
In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, once lit, a piece of wood burns by itself, and doesn't "unburn". Again, when an ice cube is put into a glass of warm water, and allowed to melt, it leaves a glass of cool water; the ice cube will not re-form, leaving itself in warm water. Such one-way processes are said to be irreversible. Some processes in nature may for many purposes be regarded as reversible, for example, the orbiting of the planets around the sun. If you reversed a movie of burning wood, you would see things that you know are impossible in the real world. On the other hand, run in reverse, a movie of the planets orbiting the sun would appear to be possible.
An isolated body of matter and radiation, in an unchanging state, with no detectable flows, is said to be in a state of thermodynamic equilibrium. Thermodynamic entropy has a definite value for such a body. Irreversibility is very precisely described in terms of entropy by a law of nature known as the second law of thermodynamics, which says that when bodies of matter and radiation interact intimately, their total entropy increases.
While the second law, and thermodynamics in general, is accurate in its predictions of intimate interactions of bodies of matter and radiation, scientists are not content with simply knowing how things behave, but want to know also WHY they do so. The question of why entropy increases in thermodynamic processes was tackled over time by several physicists, regarding bodies of matter and radiation as composed of extremely small particles, such as atoms. A masterly answer was given in 1877 by Ludwig Boltzmann, in terms of a theory that is now known as statistical mechanics. It explains thermodynamics in terms of the diverse microscopical motions of the particles, analysed statistically. The theory explains not only thermodynamics, but also a host of other phenomena which are outside the scope of thermodynamics.
Your thoughts?Chjoaygame (talk) 05:08, 17 December 2020 (UTC)[reply]
I see what you are saying. I am thinking of the statistical mechanical entropy, which IS defined for nonequilibrium states. Any state, disequilibrium or not, has a macrostate, and an associated set of microstates, and therefore the statistical mechanical entropy k Log(W) has meaning, even though the thermodynamic entropy S does not. I think you are saying that since we have not yet introduced the stat mech entropy, we should not be talking in stat mech terms. I looked at the Second law of thermodynamics page, and the use of the words "increase" and "decrease" is all over the place. Strictly speaking, they should never be used to describe a thermodynamic entropy change. Only "larger than" or "equal" or "smaller than" should be used. I wonder if this is a case where rigor is a hindrance to the newcomer rather than a help. I have been accused of being too rigorous, but I have my limits. What do you think? signature added: PAR, posted 20:45, 17 December 2020.
Quoting: I looked at the Second law of thermodynamics page, and the use of the words "increase" and "decrease" is all over the place. Strictly speaking, they should never be used to describe a thermodynamic entropy change. Only "larger than" or "equal" or "smaller than" should be used. I wonder if this is a case where rigor is a hindrance to the newcomer rather than a help. I have been accused of being too rigorous, but I have my limits. What do you think?
I am not too worried about this. I guess that increases and decreases can be stepwise or gradual and continuous.
More concerning to me is Editor Chetvorno's comment:
Quoting: But the title of the article is not Irreversibility, it is Entropy, the introduction gives readers no idea of what that is. "...entropy refers to spread of energy or matter, or to extent and diversity of microscopic motion." is not going to mean anything to readers below college level.
Three points here. 'Spread of energy or matter' is incomprehensible. The topic of the article is 'entropy', not 'irreversibility'. The lead does not give an applicable definition of 'thermodynamic entropy'.
The usual Wikipedia custom is that the lead is a summary. I am distinguishing between the lead and a possible section, in the body of the article, entitled 'Introduction'. I see the lead as a kind of guidepost to the body of the article, not containing any more explanatory material or technical detail than is necessary. This view of mine seems a bit at odds with some other views on this page, that seem to want the lead to explain and define in detail. I think when Editor Chetvorno writes "introduction" he is referring to what I call the lead. His three points call for close attention.
Quoting: Any state, disequilibrium or not, has a macrostate, and an associated set of microstates, ...
It depends on what you mean by 'macrostate'. For the general non-equilibrium body of matter and radiation, in general, the thermodynamic state variables, such as temperature, pressure, and in some cases internal energy and suchlike, are undefined. That they are defined in special cases is what makes thermodynamics. The general non-equilibrium state has non-zero flows of matter and energy, and such flows are permitted to be turbulent: hard to define a macrostate then. Boltzmann's H-function is based on a very special non-equilibrium state, and it may be very hard indeed to define precisely a corresponding function, with a suitable sample space or ensemble, for other non-equilibrium states. People look at Boltzmann's H-function, see that it looks like Shannon's function, say to themselves 'Shannon's function has been labeled 'entropy', therefore Boltzmann's H-function is an entropy, therefore I can say that it defines physical non-equilibrium entropy'. As far as I know, the only serious attempt to define a true thermodynamic entropy for non-equilibrium is Phil Attard's hierarchy of multi-time 'entropies'. He wrote to me that he didn't know how to measure them. Futuristic stuff. Off the table for us, I think. In the meantime, I think that it is just a card-trick or word game to speak of 'non-equilibrium entropy' in general as if it had physical statistical mechanical meaning.
Quoting: I think you are saying that since we have not yet introduced the stat mech entropy, we should not be talking in stat mech terms.
I didn't have that thought in mind. Not concerned about it yet.Chjoaygame (talk) 03:42, 18 December 2020 (UTC)[reply]
You said: "I am not too worried about this. I guess that increases and decreases can be stepwise or gradual and continuous." But you said you had a problem with the statement "Entropy does not increase indefinitely". PAR (talk) 07:36, 19 December 2020 (UTC)[reply]
The problem I see with the sentence "Entropy does not increase indefinitely" is that it relies for its logic on the paragraph previous to it, and that paragraph is faulty. Chjoaygame (talk) 09:26, 19 December 2020 (UTC)[reply]
I am concerned that you are referring to a paragraph that has been removed. Could you re-read the previous paragraph and tell me if you have objections to it and, if so, what they are.PAR (talk) 20:32, 19 December 2020 (UTC)[reply]
You quoted Chetvorno: "But the title of the article is not Irreversibility, it is Entropy, the introduction gives readers no idea of what that is." This is not true, as seen by the statement "In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time.". The only way thermodynamic entropy presents itself to our senses and measuring instruments is through its change. Thermodynamic entropy cannot be experimentally measured, only its change. PAR (talk) 07:36, 19 December 2020 (UTC)[reply]
I don't mean that Editor Chetvorno's sentence is fully compelling; "no idea" is an exaggeration. I just mean that we should attend to his concern that 'entropy' has not been fully and precisely defined at that point. An attribute of thermodynamic entropy has been declared, but not a fully precise definition. He is perhaps concerned that other attributes of entropy still remain to be declared. It's not immediately obvious how to respond to his concern, but I think some attention is needed. Chjoaygame (talk) 09:26, 19 December 2020 (UTC)[reply]
I think a definition of thermodynamic entropy is too much for an introduction section and should go in the explanation section.PAR (talk) 20:36, 19 December 2020 (UTC)[reply]
I tend to agree, but I think we need to attend to Editor Chetvorno's concern. I like to distinguish between a lead and a section in the body of the article entitled 'Introduction'. Chjoaygame (talk) 20:49, 19 December 2020 (UTC)[reply]
Regarding my statement "Any state, disequilibrium or not, has a macrostate, and an associated set of microstates ...". Yes, I have to amend that statement. It holds for cases where, e.g., the pressure and temperature for each type of particle are defined functions of space and time (local thermodynamic equilibrium). Now entropy density can be defined and total entropy can be said to increase, assuming we can talk about the rates of energy and entropy transfer between two equilibrated systems in contact and almost in equilibrium with each other. But the question is, how does this apply to an introductory article? Can we use the words "Entropy does not increase indefinitely" as an introductory statement, knowing that it is flawed? I think the verbal gymnastics we have to go through to keep things rigorous but simple are almost overwhelming in this case. PAR (talk) 07:36, 19 December 2020 (UTC)[reply]
Local thermodynamic equilibrium is a relatively new concept. I think it was introduced by Milne to deal with radiation. I am not sure of its exact history. It was used by others such as Onsager, Prigogine, Gyarmati, and De Groot & Mazur. It's an approximation, reliant on strong assumptions. I think that Prigogine 1947 has conceptual problems. Yes, I agree that it is not for the lead of this articleChjoaygame (talk) 09:26, 19 December 2020 (UTC)[reply]

cards on the table[edit]

(Discussion deleted and continued on User_talk:PAR)

It seems to me that none of this material by Chjoaygame is relevant for this article, which is an "Introduction to .." article. Some of it may be relevant to Entropy, but I am not even clear about that. This article should be understood by someone who has just been introduced to the topic of entropy. Chjoaygame, please stop wasting our time reading through your long edits to see if anything is relevant to this article. --Bduke (talk) 22:27, 20 December 2020 (UTC)[reply]

I am just as guilty of running down this rabbit hole with Chjoaygame. I agree, it has no place on this talk page, and I have removed it to my talk page. Chjoaygame and I have had many arguments in the past, and yes, he tends to say in two complicated paragraphs what he could say in one sentence. In his defense, he will actually engage in an argument, rather than ignoring what you say and repeating the same POV over and over again. He will adjust his understanding as need be and does not engage in ad hominem attacks. This is rare, and I try to do the same. The discussion actually does derive from a disagreement about the wording of the introduction, and if we reach a consensus of two on my talk page, we will offer any conclusions here. PAR (talk) 23:45, 20 December 2020 (UTC)[reply]

Outstanding questions[edit]

I thought it might be useful to take stock of where we are in terms of points of agreement or disagreement. So here's what I see as three major questions where we need to achieve consensus, together with what I've seen as the likely answers. (I trust y'all will add anything that I've missed here!) Not trying to break any new ground here; just hoping to provide an organizational framework where it might be easier to keep track of the discussion (which I'm having a bit of trouble following, tbh). Hopefully this is something we can !vote on, and it'll make it clearer where things stand. DrPippy (talk) 14:58, 12 December 2020 (UTC)[reply]

It may be kept in mind that Wikipedia articles are subject to perpetual re-editing by individual editors. Chjoaygame (talk) 20:33, 19 December 2020 (UTC)[reply]
That's the whole point of achieving a consensus: if six editors can compromise their way to a consensus, that will be six editors who will object to somebody who has just read their first pop-entropy book and is determined to "set things straight". PAR (talk) 20:44, 19 December 2020 (UTC)[reply]
Some individual editors are not like that. And a consensus is not necessarily needed to knock off such a straight-setting. Chjoaygame (talk) 20:58, 19 December 2020 (UTC)[reply]

Question 1: How much math should this article incorporate?[edit]

Option 1: No math at all
Option 2: No math in the introduction, some basic math in the body of the article
Option 3: Basic math throughout, including introduction
Option 4: The article should incorporate an advanced mathematical treatment of entropy; introduction might or might not have some math, but we should end up in the deep end.

  • Option 1 on this for me; I think this would hang together better as a purely conceptual explanation of the idea of entropy in various contexts. I'd be okay with option 2 as well. I think any math in the intro is a bad idea. DrPippy (talk) 14:58, 12 December 2020 (UTC)[reply]
  • Option 2, but keep it at a very simple level in the body. PAR (talk) 00:27, 13 December 2020 (UTC)[reply]
  • Option 2, agree with PAR --ChetvornoTALK 20:06, 17 December 2020 (UTC)[reply]

Question 2: Which views of entropy should be incorporated into this article?[edit]

Option 1: Thermodynamic
Option 2: Stat mech
Option 3: Informational
Option 4: Popular (disorder, etc.)
Other?: I'm sure I've left some out...

  • Options 1, 2, 4 for sure, no opinion on 3. Again, I think the approach of trying to explain the popular view of entropy in terms of the scientific view is exactly the right one; we just need to figure out which version of the scientific view is the right one. DrPippy (talk) 14:58, 12 December 2020 (UTC)[reply]
  • Options 1, 2, 3, 4, with low emphasis on the nuts and bolts of information entropy. PAR (talk) 00:27, 13 December 2020 (UTC)[reply]
  • Options 1, 2, 3, 4. Since this article's title is Introduction to entropy, we have to define all the types of entropy in the introduction, but I agree Shannon entropy should just be briefly defined, per WP:SUMMARYSTYLE. --ChetvornoTALK 20:20, 17 December 2020 (UTC)[reply]
  • Comment. 'Other' might include or point to general abstract mathematical entropies, that might pretend or intend to 'really capture' the 'underlying concept' of 'entropy'.Chjoaygame (talk) 20:39, 19 December 2020 (UTC)[reply]

Question 3: Which version of entropy should we lead off with[edit]

Option 1: Thermodynamic
Option 2: Stat mech
Option 3: Informational
Option 4: Popular (disorder, etc.)
Option 5: Other/none of the above/all of the above

  • Option 2 is my strong preference, for the reasons given above. The short version is that the stat mech picture of entropy (1) explains the popular view; (2) explains the thermodynamic view; (3) explains many other related concepts (diffusion, irreversibility, heat flow, others); and, most importantly (4) does all of these in an intuitive way which is accessible to a popular audience. DrPippy (talk) 14:58, 12 December 2020 (UTC)[reply]
  • Option 1 is the best way to lead a new reader into an understanding of thermodynamic entropy. Pose the problem (thermodynamic entropy) then the solution (statistical mechanics). Vice versa is backwards. PAR (talk) 00:27, 13 December 2020 (UTC)[reply]
  • Boltzmann thought that the physics of thermal motion must reflect Newton's laws (we agree, modulo technicalities). Phase space expresses, or relies on, Newtonian physics. The mathematical problem posed by Newton's physical laws taken literally, expressed in terms of phase space, is formidable; it was practically overwhelming for Boltzmann and for some generations after him. Probabilistic thinking based on phase space is a mathematical artifice, relying on ergodic theory, to get around those difficulties. Some physicists say that 'mathematics is the language of physics'. Thermodynamics is about physical facts. One approach wants ordinary-language conceptions, explanation, theory, and mathematical reasoning to be placed in the article ahead of the physical facts; Editor PAR wants to put the physical facts first.
'Entropy' is a word that expresses several more or less distinct concepts, belonging to several respective distinct frames of thinking. Chjoaygame (talk) 01:19, 13 December 2020 (UTC)[reply]
  • I agree with user PAR that we should start with thermodynamic entropy. The reason is that it is more clearly related (in many cases) to familiar macroscopic experience. Most readers know that hot water mixes with cold water to give lukewarm water, or that gases under pressure expand, etc. etc. Thermodynamic entropy gives a simple way of predicting the direction of a process, as introduced by Carnot with purely macroscopic reasoning. Statistical reasoning explains why thermodynamic entropy works, but it requires a fairly detailed molecular picture to understand properly, so in an Introduction article it should be presented after thermodynamic entropy and only briefly. Dirac66 (talk) 02:06, 13 December 2020 (UTC)[reply]
  • Option 2: The problem with Option 1 is that it gives no understandable definition of entropy. The existing introduction explains that thermodynamic entropy is related to irreversibility. I like that approach, and I think some of it should be saved in the new introduction. But the title of the article is not Irreversibility, it is Entropy, and the introduction gives readers no idea of what that is. "...entropy refers to spread of energy or matter, or to extent and diversity of microscopic motion." is not going to mean anything to readers below college level. The thermodynamic definition, the change in thermal energy per unit temperature, similarly doesn't give readers any insight into why it should be related to irreversibility (although I think it probably should be in the introduction). Therefore we have to give some kind of statistical-mechanics definition up front. That, and the diffusion animation, will explain the relation to irreversibility, and also the popular meaning of entropy as disorder. --ChetvornoTALK 20:06, 17 December 2020 (UTC)[reply]
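(For concreteness, and not as proposed article text: one standard form of the statistical-mechanics definition discussed above is Boltzmann's
:<math>S = k_\mathrm{B} \ln W,</math>
with <math>W</math> the number of microstates compatible with the macrostate. Applied to the diffusion animation: for <math>N</math> distinguishable molecules free to occupy either half of a box, only one arrangement has all of them on one side, while roughly <math>\tbinom{N}{N/2}</math> arrangements have them evenly split, so for anything like a mole of gas the mixed macrostate is overwhelmingly more probable and mixing looks irreversible.)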

footnote style calls for editorial re-think[edit]

There are two footnotes in the lead that are formatted as references.

[2] [3]

  1. ^ "Definition of entropy in English". Lexico Powered By Oxford. Retrieved 18 November 2020.
  2. ^ Theoretically, coffee can be "unmixed" and wood can be "unburned", but for this you would need a "machine" that would generate more entropy than was lost in the original process. This is why the second law only holds for isolated systems, which means they cannot be connected to some external "machine".
  3. ^ Strictly speaking, thermodynamics only deals with systems in equilibrium. The idea that entropy is continuously "changing" is actually an approximation in which the change is considered to be a number of individual steps, each step being an equilibrium state derived from the previous one.

Their logic comes too close to "I have just said 'X', but more precisely I should have said 'not X'." That is deeply poor style, not Wikistyle (my neologism, I guess). They call for a thorough editorial re-think. Supposing there is still a place for footnotes, they should be formatted as footnotes, not as references; though, in my view, footnotes are by their nature usually poor style for Wikipedia, and are much better avoided altogether. Chjoaygame (talk) 22:51, 15 December 2020 (UTC)[reply]
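(Again for concreteness rather than as proposed article text: the approximation described in the second of the two footnotes under discussion, treating a continuous change as a sequence of equilibrium steps, is what lets one compute, for example, the entropy change of a reversible isothermal expansion of <math>n</math> moles of ideal gas from volume <math>V_i</math> to <math>V_f</math> as
:<math>\Delta S = nR \ln\frac{V_f}{V_i},</math>
each small step of the expansion being taken as an equilibrium state.)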