[HN Gopher] The Second Law of Thermodynamics (2011)
___________________________________________________________________
The Second Law of Thermodynamics (2011)
Author : luu
Score : 130 points
Date : 2024-07-16 00:12 UTC (22 hours ago)
(HTM) web link (franklambert.net)
(TXT) w3m dump (franklambert.net)
| Harmohit wrote:
| As another comment mentioned, this website does look like Time
| Cube at first sight.
|
| However, the explanations of the second law of thermodynamics on
| the second page are quite decent and written in a humorous way.
| Of course it is not fully accurate because it does not use any
| math, but I think it does a good enough job of explaining it to
| the layperson.
|
| The explanations about human life on the third page are
| analogies at best. The situations that the author describes are
| similar to the workings of the second law but are not a
| first-principles outcome of it.
| exmadscientist wrote:
| A neat little corollary to this is to look a little more closely
| at what temperature actually _is_. "Temperature" doesn't appear
| too often in the main explanation here, but it's all over the
| "student's explanation". So... what is it?
|
| The most useful definition of temperature at the microscopic
| scale is probably this one: 1/T = dS / dU, which I've simplified
| because math notation is hard, and because we're not going to
| need the full baggage here. (The whole thing with the curly-d's
| and the proper conditions imposed is around if you want it.)
| Okay, so what does _that_ mean? (Let's not even think about
| where I dug it up from.)
|
| It's actually pretty simple: it says that the inverse of
| temperature is equal to the change in entropy over the change in
| energy. That means that temperature is measuring how much the
| _entropy_ changes when we add or remove _energy_. And now we
| start to see why temperature is everywhere in these energy-
| entropy equations: it's the link between them! And we see why
| two things having the same temperature is so important: _no
| entropy will change_ if energy flows. Or, in the language of the
| article, _energy would not actually spread out any more_ if it
| would flow between objects at the same temperature. So there's
| no flow!
|
| The whole 1/T bit, aside from being inconvenient to calculate
| with, also suggests a few opportunities to fuzz-test Nature. What
| happens at T=0, absolute zero? 1/T blows up, so dS/dU should blow
| up too. And indeed it does: at absolute zero, _any_ amount of
| energy will cause a _massive_ increase in entropy. So we're
| good. What about if T -> infinity, so 1/T -> zero? So any
| additional energy induces no more entropy? Well, that's real too:
| you see this in certain highly-constrained solid-state systems
| (probably among others), when certain bands fill. And you do
| indeed observe the weird behavior of "infinite temperature" when
| dS/dU is zero. Can you push further? Yes: dS/dU can go _negative_
| in those systems, making them "infinitely hot", so hot they
| overflow temperature itself and reach "negative temperature"
| (dS/dU < 0 implies absolute T < 0). Entropy actually _decreases_
| when you pump energy into these systems!
|
| These sorts of systems usually involve population inversions
| (which might, correctly, make you think of lasers). For a 2-band
| system, the "absolute zero" state would have the lower band full
| and the upper band empty. Adding energy lifts some atoms to the
| upper band. When the upper and lower band are equally full,
| that's maximum entropy: infinite temperature. Add a little more
| energy and the upper band is now more full than the lower: this
| is the negative temperature regime. And, finally, when
| everything's in the upper band, that is the exact opposite of
| absolute zero: the system can absorb no more energy. Its
| temperature is maximum. What temperature is that? Well, if you
| look at how we got here and our governing equation, we started at
| 0, went through normal temperatures +T, reached +infinity,
| crossed over to -infinity, went through negative temperatures -T,
| and finally reached... -0. Minus absolute zero!
|
| (Suck on that, IEEE-754 signed zero critics?)
|
| And all that from our definition of temperature: how much entropy
| will we get by adding a little energy here?
|
| Thermodynamics: it'll hurt your head even more than IEEE-754
| debugging.
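|
| (A minimal numerical sketch of the two-band picture above, not
| from the comment: N independent two-level "atoms" with level
| spacing eps, entropy taken as the log of the number of ways to
| excite n of them, and 1/T = dS/dU estimated as a finite
| difference. Names and numbers are illustrative only.)
|
|     # Two-level-system sketch of 1/T = dS/dU.
|     # N atoms, each in the lower band (energy 0) or the upper band (energy eps).
|     # Entropy S(n) = kB * ln(C(N, n)) for n excited atoms; energy U = n * eps.
|     from math import lgamma
|
|     N, eps, kB = 1000, 1.0, 1.0   # atom count, level spacing, Boltzmann constant
|
|     def entropy(n):
|         """kB * ln(number of ways to excite n of the N atoms)."""
|         return kB * (lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1))
|
|     def temperature(n):
|         """Finite-difference estimate of T = dU/dS at occupation n."""
|         dS = entropy(n + 1) - entropy(n)   # exciting one more atom...
|         dU = eps                           # ...adds eps of energy
|         return dU / dS
|
|     for n in (0, 100, 499, 500, 900, 999):
|         print(f"n = {n:4d}   T ~ {temperature(n):+10.3f}")
|
|     # T starts small and positive with the upper band nearly empty, grows
|     # huge approaching n = N/2 (maximum entropy), flips sign just past it,
|     # and heads back toward -0 as the upper band fills.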
| vinnyvichy wrote:
| More intuitively: that TdS has the same "units" as -PdV
| suggests that temperature [difference] is a "pressure"
| (thermodynamic potential) that drives entropy increase.
| kgwgk wrote:
| It has the same "units" (if you mean "energy") as mc^2 as well
| and that doesn't suggest anything to me... Your intuition is
| much better than mine - or it's informed by what you know
| about temperature.
| vinnyvichy wrote:
| Sorry! I meant they have the same form as used in the
| energy differential (1-form), but I had thought "units"
| would make more sense. In fact, this comparison was how I
| came to the intuition, although, as you coyly suggested, I
| did do a check with my earlier intuitions..
|
| https://physics.stackexchange.com/questions/415943/why-does-...
| kgwgk wrote:
| I agree that thermodynamic relations - and Legendre
| transformations - are fascinating. I don't think I ever
| fully understood them though - at least not to the point
| where they became "intuitive" :-)
| vinnyvichy wrote:
| Erm, sorry again to have implied they were intuitive. All I
| meant was that it was relatively intuitive -- maybe I should
| have said "retrievable in a high-pressure concept-doodling
| game" -- compared to a wall of text..
| kgwgk wrote:
| No need to apologize! I was joking, I think I get what
| you mean.
| vinnyvichy wrote:
| If you let me flash you my (still indecent) state of
| intuition..
|
| "convex conjugates" (more precisely but limited sense
| "momentum maps") are delimited continuations in a
| optimization algorithm.
|
| https://en.wikipedia.org/wiki/Convex_conjugate
| https://en.wikipedia.org/wiki/Delimited_continuation
| 082349872349872 wrote:
| Delimited continuations are to exponentials as convex
| conjugates are to implications?
| vinnyvichy wrote:
| I'm pretty sure I don't understand the possible meanings
| of what you said there either so let's try :)
|
| <layman-ish op-research lingo>
|
| I meant that the tangent to the convex conjugate
| ("momentum") provides bounds on what the values returned
| by the dual step in a primal-dual algo should be. I don't
| know which meaning of "exponential" I should focus on
| here (the action perhaps? A power set? A probability
| distribution?), but "implications" seem to refer to a
| constraint on outputs contingent on the inputs so I will
| go with that. Delimited continuations seem to be the
| closest thing I found in the PL lit, aka wikipedia, feel
| free to suggest something less kooky :)
|
| </lol>
| shiandow wrote:
| It's also precisely what will show up if you use Lagrange
| multipliers to maximize entropy given a fixed energy. (though
| for that to make sense you're no longer looking at a single
| state, you're optimizing the probability distribution itself)
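|
| (A small numerical check, not from the thread: take the
| Lagrange-multiplier result as given, namely that maximizing
| S = -sum p ln p at fixed mean energy gives Boltzmann weights
| exp(-beta*E)/Z, and verify that the multiplier beta really is
| dS/dU, i.e. the 1/T from upthread. The energy levels and target
| energy below are made-up numbers.)
|
|     # Solve for the beta whose Boltzmann distribution has mean energy U,
|     # then check numerically that dS/dU equals that beta.
|     from math import exp, log
|
|     E = [0.0, 1.0, 2.0, 5.0]              # made-up energy levels
|
|     def boltzmann(beta):
|         w = [exp(-beta * e) for e in E]
|         Z = sum(w)
|         return [x / Z for x in w]
|
|     def mean_energy(beta):
|         return sum(p * e for p, e in zip(boltzmann(beta), E))
|
|     def entropy(p):
|         return -sum(pi * log(pi) for pi in p if pi > 0)
|
|     def beta_for(U, lo=-50.0, hi=50.0):
|         """Bisect for beta; mean_energy is decreasing in beta."""
|         for _ in range(200):
|             mid = 0.5 * (lo + hi)
|             if mean_energy(mid) > U:
|                 lo = mid                  # too hot, need a larger beta
|             else:
|                 hi = mid
|         return 0.5 * (lo + hi)
|
|     U, dU = 1.5, 1e-4
|     beta = beta_for(U)
|     dSdU = (entropy(boltzmann(beta_for(U + dU)))
|             - entropy(boltzmann(beta_for(U - dU)))) / (2 * dU)
|     print(f"Lagrange multiplier beta = {beta:.4f}")
|     print(f"numerical dS/dU (= 1/T)  = {dSdU:.4f}")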
| vinnyvichy wrote:
| Yeah I have been ruminating on the strange coincidence in
| the naming of Lagrange multipliers, Lagrangian, Lagrangian
| duals..
|
| (See my comment below about convex conjugates and delimited
| continuations.)
| yamrzou wrote:
| I like the following related explanation
| (https://www.reddit.com/r/thermodynamics/comments/owhkiv/comm...):
|
| > Many people focus on the statistical definition of entropy
| and the fact that entropy increases for any spontaneous
| process. Fewer people are familiar with thinking about entropy
| as the conjugate thermodynamic variable to temperature. Just as
| volumes shift to equalize pressure, areas shift to equalize
| surface tension, and charges shift to equalize voltage, entropy
| is the "stuff" that shifts to equalize temperature. (Entropy is
| of course also unique in that it's generated in all four
| processes.) Entropy is thus in some ways the modern version of
| the debunked theory of caloric.
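|
| (A tiny simulation in that spirit, assumptions mine: two blocks
| with constant heat capacities trade small parcels of heat until
| their temperatures match. Entropy leaves the hot block at dQ/T_hot
| per parcel and enters the cold one at the larger dQ/T_cold, so a
| little extra entropy is generated along the way, which is exactly
| where the caloric picture breaks down.)
|
|     # Two blocks exchanging heat until their temperatures equalize.
|     C1, C2 = 1.0, 2.0        # heat capacities (made-up units)
|     T1, T2 = 400.0, 300.0    # block 1 starts hotter
|     S1 = S2 = 0.0            # entropy changes relative to the start
|     dQ = 1e-3                # size of each heat parcel
|
|     while T1 - T2 > 1e-6:
|         S1 -= dQ / T1        # hot block loses entropy dQ/T1 ...
|         S2 += dQ / T2        # ... cold block gains the larger dQ/T2
|         T1 -= dQ / C1        # temperatures drift toward each other
|         T2 += dQ / C2
|
|     print(f"final temperatures: {T1:.2f} {T2:.2f}")
|     print(f"entropy changes: {S1:+.5f} {S2:+.5f}, total {S1 + S2:+.5f} > 0")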
| passion__desire wrote:
| > Just as volumes shift to equalize pressure, areas shift to
| equalize surface tension, and charges shift to equalize
| voltage, entropy is the "stuff" that shifts to equalize
| temperature.
|
| I remember watching videos of Leonard Susskind in which he
| talked about a similar phenomenon where circuit complexity
| itself increases until it reaches a maximum. It behaves
| similarly to entropy.
|
| Complexity and Gravity - Leonard Susskind
|
| https://youtu.be/6OXdhV5BOcY?t=3046
|
| https://www.quantamagazine.org/in-new-paradox-black-holes-ap...
| n_plus_1_acc wrote:
| I like this explanation, but I feel it builds on a good
| understanding of entropy
| yamrzou wrote:
| If you want an independent definition of temperature without
| reference to entropy, you might be interested in the Zeroth
| Law of Thermodynamics
| (https://en.wikipedia.org/wiki/Zeroth_law_of_thermodynamics).
|
| Here is an intuitive explanation for it from [1]:
|
| "Temperature stems from the observation that if you bring
| physical objects (and liquids, gases, etc.) in contact with
| each other, heat (i.e., molecular kinetic energy) can flow
| between them. You can order all objects such that:
|
| - If Object A is ordered higher than Object B, heat will flow
| from A to B.
|
| - If Object A is ordered the same as Object B, they are in
| thermal equilibrium: No heat flows between them.
|
| Now, the position in such an order can be naturally
| quantified with a number, i.e., you can assign numbers to
| objects such that:
|
| - If Object A is ordered higher than Object B, i.e., heat
| will flow from A to B, then the number assigned to A is
| higher than the number assigned to B.
|
| - If Object A is ordered the same as Object B, i.e., they are
| in thermal equilibrium, then they will have the same number.
|
| This number is temperature."
|
| [1] https://physics.stackexchange.com/a/727798/36360
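|
| (A toy illustration, mine rather than the quoted answer's: given
| only a pairwise "does heat flow from A to B?" observation, you can
| sort objects into a total order and label that order with numbers.
| Any increasing labelling will do, which is the point the next
| comment makes about there being many possible scales.)
|
|     # Recover a "temperature" labelling from pairwise heat-flow observations.
|     from functools import cmp_to_key
|
|     # Hidden physical state; the labels below see it only through heat_flow().
|     hidden_hotness = {"ice": 273.0, "room": 293.0, "body": 310.0, "kettle": 370.0}
|
|     def heat_flow(a, b):
|         """+1 if heat flows from a to b, -1 if from b to a, 0 if neither."""
|         ha, hb = hidden_hotness[a], hidden_hotness[b]
|         return (ha > hb) - (ha < hb)
|
|     coldest_first = sorted(hidden_hotness, key=cmp_to_key(heat_flow))
|     labels = {name: 10 * rank for rank, name in enumerate(coldest_first)}
|     print(labels)   # {'ice': 0, 'room': 10, 'body': 20, 'kettle': 30}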
| meindnoch wrote:
| Yes, but this still allows infinitely many "temperature"
| scales. I.e. take the current definition of temperature,
| and apply any strictly increasing function to it.
| jwmerrill wrote:
| From later in [1]
|
| > Mind that all of this does not impose how we actually
| scale temperature.
|
| > How we scale temperature comes from practical
| applications such as thermal expansion being linear with
| temperature on small scales.
|
| An absolute scale for temperature is determined (up to
| proportionality) by the maximal efficiency of a heat engine
| operating between two reservoirs: e = 1 - T2/T1.
|
| This might seem like a practical application, but
| intellectually, it's an important abstraction away from the
| properties of any particular system to a constraint on all
| possible physical systems. This was an important step on
| the historical path to a modern conception of entropy and
| the second law of thermodynamics [2].
|
| [1] https://physics.stackexchange.com/a/727798/36360
|
| [2] https://bayes.wustl.edu/etj/articles/ccarnot.pdf
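|
| (A sketch with made-up numbers: for a reversible engine,
| Q_cold/Q_hot = T_cold/T_hot, so measured heats fix temperature
| ratios, and choosing one reference point, e.g. 273.16 K for
| water's triple point, pins down the whole scale.)
|
|     # Absolute temperature from heats exchanged by an ideal reversible engine.
|     T_REF = 273.16                 # defined reference temperature, in kelvin
|
|     def absolute_temperature(q_reservoir, q_ref):
|         """Temperature of a reservoir, from the heats it and the reference
|         reservoir exchange per cycle with a reversible engine."""
|         return T_REF * (q_reservoir / q_ref)
|
|     q_ref, q_hot = 100.0, 137.0    # joules per cycle, hypothetical measurements
|     T_hot = absolute_temperature(q_hot, q_ref)
|     print(f"T_hot ~ {T_hot:.2f} K, Carnot efficiency ~ {1 - T_REF / T_hot:.3f}")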
| bollu wrote:
| Does the temperature actually change discontinuously in a
| physical system from -infty to +infty, or is it a theoretical
| artifact that does not show up experimentally?
| kgwgk wrote:
| Depending on what you mean by "discontinuously" it always
| does: the microscopic world is "discrete".
|
| Instead of thinking of "temperature" you may think of
| "inverse of temperature" and then there is no issue with that
| number going "continously" from very negative to very
| positive.
| edngibson wrote:
| Interesting - you're a great writer!
| dougSF70 wrote:
| Is the special case of 1/T = 0 also known as the Big Bang?
| goatsneez wrote:
| The 2nd law only states a direction; it does not determine the
| rate at which things change. It is also related to the
| spontaneity of reactions. What is the role of activation energy
| (or of other potential barriers, e.g. from the weak/strong
| nuclear forces)?
|
| What prevents everything from happening all at once? Is there a
| reason beyond obeying the 2nd law? And if there is, is there a
| consistent formulation of the 2nd law plus some other law that
| gets this problem right, at least macroscopically?
| kordlessagain wrote:
| Related: https://www.sidis.net/animate.pdf
| robaato wrote:
| The classic Flanders and Swann explanation:
| https://www.youtube.com/watch?v=VnbiVw_1FNs
|
| Excerpts: No one can consider themselves educated who doesn't
| understand the basic language of science - Boyle's law: the
| greater the external pressure, the greater the volume of hot air.
| I was somewhat shocked to learn my partner not only doesn't
| understand the 2nd law of thermodynamics, he doesn't even
| understand the first!
|
| : Heat won't pass from the cooler to hotter! You can try it if
| you like but you'd far better notta!
|
| M: Heat is work and work's a curse
| M: And all the heat in the universe
| M: Is gonna cool down,
| M: 'Cos it can't increase
| M: Then there'll be no more work
| M: And there'll be perfect peace
| D: Really?
| M: Yeah, that's entropy, Man.
| jaredhansen wrote:
| Thank you for posting this! I'd never heard it, and it's great.
| foobarian wrote:
| I could never wrap my head around the abstract concepts used in
| these explanations because they don't connect to what is actually
| happening at the atomic level. As far as I could tell the actual
| particles are undergoing a constant process of reducing the
| potential energy induced by force fields between them, which
| means everything is just jiggling all the time and spreading
| further and further apart. Heat is just some metric describing
| the aggregate behavior.
| 11101010001100 wrote:
| This all applies at the quantum level. Ask your quantum
| computing friends why we don't have quantum computers yet.
| marcosdumay wrote:
| > the actual particles are undergoing a constant process of
| reducing the potential energy induced by force fields between
| them
|
| Not really. They are in the process of spreading that energy as
| equally as possible through as many fields as they can.
|
| What is the Second Law of Thermodynamics.
| passion__desire wrote:
| > Don't put me down. I could have snowed you with differential
| equations and diagrams instead of what you see everyday. We're
| being practical and visual rather than going the math route,
| essential as that is in chemistry.
|
| > The big deal is that all types of energy spread out like the
| energy in that hot pan does (unless somehow they're hindered from
| doing so). They don't tend to stay concentrated in a small space.
|
| I am trying to loosely connect big ideas here, so I might be
| wrong. If there is a fundamental feature of a universal law, then
| that feature must manifest itself at all scales, as the above
| statements try to put forward visually. Maybe this idea of flow
| spreading out is very general, and some kind of summarization of
| finer-grained flow into coarser flow, in the form of Green's
| theorem or Stokes' theorem, captures it.
|
| Kinematic Flow and the Emergence of Time
|
| https://arxiv.org/abs/2312.05300
| ninetyninenine wrote:
| This is what confuses people. There is this universal law, but
| you already know about it.
|
| It's probability. Increasing Entropy is a result of
| probability. That's all it is.
|
| When you have a bunch of particles and you jostle them, it is
| MORE probable for the particles to become spread out than to
| become concentrated in one corner. That probability is what is
| behind this mysterious force called entropy.
|
| Why is it more probable? You just count the number of possible
| states. There are MORE possible "spread out" states than there
| are "concentrated" states. In most systems there are more
| disorganized states than there are organized states.
|
| Think of it in terms of dice. If you roll six dice, how likely
| are you to get some random spread of numbers vs. all the
| numbers concentrated on 6? Or all numbers concentrated on 1?
|
| It's more probable to get a random spread of numbers because
| there are astronomically more possibilities here. For all
| numbers concentrated on 1, 2, 3, 4, 5, or 6 you only have a
| total of 6 possible states: all ones, all twos, all threes...
| all sixes... that's six states in total.
|
| Random spread occupies 46650 possible states (6^6 - 6). Hence
| by probability things are more likely to become disordered and
| spread out simply because there are more possible disordered
| states.
|
| Entropy is a phenomenon of probability. People mistake it for
| some other fundamental law that mysteriously occurs. No it's
| not, it makes sense once you understand probability.
|
| The real question is: what is probability? Why does it happen
| to work? And why does probability seem to follow an arrow of
| time, when it doesn't seem symmetrical like the rest of physics?
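|
| (A quick brute-force check of that counting, for anyone who wants
| to see the 6 vs 46650 split directly:)
|
|     # Enumerate all outcomes of six dice and split them into
|     # "all the same face" versus everything else.
|     from itertools import product
|
|     outcomes = list(product(range(1, 7), repeat=6))
|     all_same = [o for o in outcomes if len(set(o)) == 1]
|     print(len(outcomes), len(all_same), len(outcomes) - len(all_same))
|     # prints: 46656 6 46650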
| lajy wrote:
| It makes even more sense when you take the law of large
| numbers into account. The scale we're experiencing most
| things on is /so/ far removed from the scale on which these
| probabilities are being expressed.
|
| There are more molecules in a cup of water (on the order of
| 10^24) than there are cups of water in the ocean. If you have
| a cup of water's worth of matter, you aren't just rolling 10
| dice (or even 1000 dice) and looking for mostly 6s. You're
| rolling a few septillion dice and hoping for a significantly
| non-normal distribution. It just isn't feasible.
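|
| (Rough numbers for that scale argument, mine: the fraction of dice
| showing a six fluctuates around 1/6 with typical size
| sqrt(p(1-p)/N), so relative fluctuations shrink like 1/sqrt(N) and
| are utterly negligible by the time N is a septillion.)
|
|     # Typical fluctuation of the "fraction of sixes" among N dice.
|     from math import sqrt
|
|     p = 1 / 6
|     for N in (10, 1_000, 10**24):
|         print(f"N = {N:.0e}   typical deviation ~ {sqrt(p * (1 - p) / N):.1e}")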
| kgwgk wrote:
| > Why does probability seem to follow an arrow of time, it
| doesn't seem symmetrical like the rest of physics.
|
| One cannot really oppose probability to "the rest of
| physics". Probability is not "a part of" physics. Probability
| is what we use to describe imperfect knowledge of a physical
| system - and we know more about the past than we do about the
| future.
| cubefox wrote:
| There is also an interesting relation between the second law of
| thermodynamics and the cosmological principle (which says "the
| distribution of matter is homogeneous and isotropic on large
| scales"):
|
| The second law of thermodynamics says that the universe has an
| entropy gradient in the time dimension, while the cosmological
| principle says that the universe has _no_ matter gradient in the
| spatial dimensions.
|
| So together they describe how the universe (space-time) is
| structured, i.e. on the temporal dimension and the spatial
| dimensions.
|
| It's also noteworthy that one enjoys the honorific "law" while
| the other is merely called a "principle". I wonder whether this
| is just an historical artifact or whether there is some
| theoretical justification for this distinction. (My intuition is
| that both are more "principles" [approximate tendencies?] than
| fundamental laws, since they don't say what's possible/impossible
| but rather what's statistically likely/unlikely.)
| stonemetal12 wrote:
| >merely called a "principle"
|
| Merely a principle? In science, principles are what
| mathematicians call axioms: not proven, but taken as true
| because you have to start somewhere and it is the only thing
| that makes sense.
|
| The cosmological principle is the philosophical position that
| physics works the same everywhere. We haven't done physics
| experiments across the universe, so we can't call it a law
| because there is not enough experimental evidence.
| j45 wrote:
| Is anyone else hearing the word thermodynamics pronounced by
| Homer J?
|
| It's tainting my ability to read this.
| waldrews wrote:
| Repeal the Second Law! Free Energy for Everyone!
___________________________________________________________________
(page generated 2024-07-16 23:01 UTC)