[HN Gopher] Entropy: A little understood concept in physics [video]
       ___________________________________________________________________
        
       Entropy: A little understood concept in physics [video]
        
       Author : guptarohit
       Score  : 185 points
       Date   : 2023-07-02 10:23 UTC (12 hours ago)
        
 (HTM) web link (www.youtube.com)
 (TXT) w3m dump (www.youtube.com)
        
       | bugs_bunny wrote:
       | An interesting read related to this is the following article,
        | which begins with von Neumann's comment that "Nobody really
        | knows what entropy really is."
       | 
       | https://www.researchgate.net/publication/228935581_How_physi...
        
       | floatrock wrote:
       | The part that was new to me was the bit about how a space full of
       | life tends toward more entropy faster than the same amount of
       | space without life.
       | 
       | Like the best ideas, it's simple and makes sense if you think
       | about it, but it's still a really interesting framing that the
       | complex machinery of life is really just the most efficient
       | "entropy converter".
       | 
       | If there's something about the arrow of time that speeds towards
       | the heat death of the universe, we're just helping it go a tiny
       | bit faster here on our floating speck of dust.
        
         | Piezoid wrote:
         | There is a video by PBS Space Time on that subject:
         | https://youtu.be/GcfLZSL7YGw
        
         | tgv wrote:
         | Isn't there a theorem that says that the overall rate is
         | constant in a closed system? If you look at the universe as a
         | whole, it would be closed, and we wouldn't have any impact.
         | However, I learnt about entropy decades ago, and never applied
         | it, so don't take my word for it.
        
         | Timon3 wrote:
         | > Like the best ideas, it's simple and makes sense if you think
         | about it, but it's still a really interesting framing that the
         | complex machinery of life is really just the most efficient
         | "entropy converter".
         | 
         | Right? I think this could be a really interesting basis for a
         | sci-fi novel - the first space-faring civilization in the
         | entire universe trying to force every form of life to keep
         | "entropy usage" to a minimum, so they can prolong their own
         | life span.
        
           | ineptech wrote:
            | Not _exactly_ what you're describing, but this is along
           | those lines: https://slatestarcodex.com/2015/06/02/and-i-
           | show-you-how-dee... It's pretty silly but the ending is both
           | satisfying and relevant.
        
           | hunter-gatherer wrote:
            | Isaac Asimov's short story "The Last Question" isn't exactly
            | in line with your suggestion, but it is a short discourse on
           | humanity's struggle against entropy.
        
         | franky47 wrote:
         | > a space full of life tends toward more entropy faster than
         | the same amount of space without life.
         | 
         | And a space full of children is exponentially faster at
         | increasing entropy.
        
         | quickthrower2 wrote:
          | Does Einstein's model of time care about entropy? In other
          | words, if there are 2 regions, one where entropy is increasing
          | at that time and one where it isn't as much, does it affect
          | time?
        
           | danq__ wrote:
           | [dead]
        
           | semi-extrinsic wrote:
           | No. Entropy can be used to explain the direction of time, as
           | a kind of symmetry breaking of all the microscopic laws that
           | are symmetric in time. But it does not say anything about the
           | "speed of time". Relativity does tell us the speed of time -
           | it's the speed of light.
        
             | Balgair wrote:
             | To be suuuuuuper pedantic here: Relativity tells us that
             | time is a dimension, one that is a bit unique. In that it
             | has a constant attached to it. So, the 3 dimensions you're
             | used to are just normal, they have no constants.
             | 
             | (x,y,z)
             | 
             | Meters of x are meters of z and meters of y. Relativity
             | (and I'm really simplifying a lot by just saying
              | 'relativity'), well relativity comes along and says that
             | time is also a dimension, just with the constant of 'c'
             | attached (the speed of light). That way you can convert
             | seconds into meters.
             | 
             | (x,y,z,ct) not just (x,y,z,t).
             | 
              | So now a unit of the time dimension is much larger than a
              | unit of the spatial dimensions. About 300,000,000 times
              | larger, a third-ish of a billion. So one second of time
              | corresponds to ~1/3 of a billion meters of space.
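              | 
              | (A quick sanity check on that conversion in plain Python,
              | with c at its defined value:)
              | 
              |     c = 299_792_458   # m/s, exact by definition
              |     t = 1.0           # one second of time
              |     print(c * t)      # ~3.0e8 "meters" of ct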
             | 
             | Now, there is a _lot_ more about relativity, like, just
             | tons. And I skipped most of it. And trying to just say that
             | time is a simple little conversion away from meters is just
              | wrong. And how that all relates to entropy is a mess that
              | we really haven't figured out yet.
        
               | cubefox wrote:
               | The last sentence surprised me.
        
               | kergonath wrote:
               | And yet... If you want to get properly depressed, there
               | is a lot we don't understand about gravity, either.
        
             | klabb3 wrote:
             | > Relativity does tell us the speed of time - it's the
             | speed of light.
             | 
             | I forget the name of the book, but it was trying to convey
             | intuitions about relativity (first special, then general).
             | 
             | It's very easy to make the mistake of trying to understand
             | space and time first, concepts we think we intuit, but in
             | relativity these are somewhat higher-level concepts.
              | Instead, start with the most fundamental part of the theory
              | and go from there: _the speed of light is constant_.
             | Accept that first. It's the comfort zone. You can always
             | return safely to this point.
             | 
             | So, when moving to space and time, the book explained it
             | like this: _everything_ moves at the speed of light, at all
             | times. It's just that instead of x,y,z - we add t, time, as
              | well. So for an object that's still, all its movement is
              | through the time dimension. Conversely, an object that
              | moves incredibly fast, like a photon, already "used" its
              | speed in the spatial dimensions, so it doesn't "age" in
              | terms of time.
             | 
             | This is just special relativity, but I liked this approach.
             | It's basically embracing the theory first instead of trying
             | to shoehorn it into the world we have so many
             | misconceptions about.
        
       | antimora wrote:
        | Just recently I also watched a video by Sabine Hossenfelder called
       | "I don't believe the 2nd law of thermodynamics":
       | https://www.youtube.com/watch?v=89Mq6gmPo0s
       | 
       | I recommend this video as well.
        
       | lisper wrote:
       | A pithier way to introduce this topic: the first law of
       | thermodynamics, a.k.a. the law of conservation of energy, is that
       | energy cannot be created nor destroyed, only transformed from one
       | form to another. In light of this, how can there ever be a
       | shortage of energy?
       | 
       | [Note that this is intended to be a rhetorical question advanced
       | for the purposes of pedagogy. If you find yourself wanting to
       | post an answer, you have missed the point.]
        
         | sghiassy wrote:
          | Usable energy is different from total energy. If energy isn't
          | concentrated (in something like gasoline, say), it's not usable.
        
         | uoaei wrote:
         | Energy is actually a red herring -- what's relevant here is
         | work.
         | 
         | Useful work, aka information, is work that can be employed in
         | dynamics vis a vis processing. Useless work, aka heat, is the
         | devil's share of the energy expenditure which is lost as
         | entropy when undergoing a process.
        
         | dekken_ wrote:
         | > how can there ever be a shortage of energy
         | 
          | I think it's not so much a shortage of energy, but that there
          | is thermodynamic equilibrium and thus no available energy to do
          | anything.
         | 
         | I don't think this will ever happen tho, it's pretty clear to
         | me that making energy more dense is a universal process.
        
           | sghiassy wrote:
           | Not really - look up "heat death" of the universe.
        
             | credit_guy wrote:
             | The "heat death" of the universe is a concept that deserves
             | to die. The second principle of thermodynamics is true only
             | if you ignore gravity. In the presence of gravity, systems
             | tend to go towards lower entropy, just see how a planetary
             | system can form out of a gas cloud.
        
               | consilient wrote:
               | > In the presence of gravity, systems tend to go towards
               | lower entropy, just see how a planetary system can form
               | out of a gas cloud.
               | 
                | This isn't correct: the entropy (and energy) of the gas
                | cloud decreases as it collapses, but the entropy of
               | its surroundings increases faster as it radiates.
        
               | credit_guy wrote:
               | Is that a fact? Or just a hypothetical way that could
               | save the second principle?
        
               | consilient wrote:
               | It's a fact. See for instance
               | https://arxiv.org/pdf/0907.0659.pdf
        
               | credit_guy wrote:
               | It's more like an opinion. Of one particular guy who has
               | a Ph.D. in Physics. But there are many, and there is no
               | consensus overall.
               | 
               | Here's the relevant quote from wikipedia [1]
               | Recent work has cast some doubt on the heat death
               | hypothesis and the applicability of any simple
               | thermodynamic model to the universe in general. Although
               | entropy does increase in the model of an expanding
               | universe, the maximum possible entropy rises much more
               | rapidly, moving the universe further from the heat death
               | with time, not closer. This results in an "entropy gap"
               | pushing the system further away from the posited heat
               | death equilibrium. Other complicating factors, such as
               | the energy density of the vacuum and macroscopic quantum
               | effects, are difficult to reconcile with thermodynamical
               | models, making any predictions of large-scale
               | thermodynamics extremely difficult.
               | 
               | [1] https://en.wikipedia.org/wiki/Entropy#Cosmology
        
               | consilient wrote:
               | That's discussing the effects of metric expansion, which
               | are not relevant for gravitationally bound systems. It
               | also doesn't claim the second law of thermodynamics
               | fails. On the contrary,
               | 
               | > entropy does increase in the model of an expanding
               | universe
        
               | kergonath wrote:
               | This is quite different from what you were saying about
               | gravity reducing entropy, though. And I know quite well
               | that having a PhD in Physics does not make someone right,
               | but then quoting Wikipedia in such an argument is really
               | not great.
        
               | dekken_ wrote:
               | I think of it as a death cult. Thinking we know all that
               | there is to know about the universe to the point where we
               | can declare with 100% certainty that any particular thing
               | will happen is not scientific.
        
               | kergonath wrote:
               | Really? We are far from really understanding gravity, but
               | I can very confidently tell you that if I kick a ball its
               | trajectory will be a parabola (roughly, as a first order
               | approximation and ignoring things like friction, which we
               | can also calculate to a decent approximation). We can say
               | where it will fall and give some confidence interval
               | depending on the conditions and such. There is nothing
               | unscientific about it.
               | 
               | Thermodynamics is not magic. In the same way that we can
               | predict the evolution of climate without knowing where
               | every single cloud will be, we can make statements about
               | the evolution of large systems even though our knowledge
               | of their state is imperfect. Again, nothing unscientific
               | about it.
        
               | mNovak wrote:
               | Not my area of expertise, but in the video at least, they
               | indicate that black holes have very high entropy. So if
               | we imagine gravity eventually pulling all those planetary
               | systems together into some number of black holes, isn't
               | gravity indeed pulling the system towards a high entropy
               | state?
               | 
               | The video actually directly addresses the gas cloud
               | question, saying basically that a gas cloud is actually a
               | highly improbable distribution of matter, whereas the
               | eventual planetary system is much more probable. The
                | claim being that the trend towards the expected state is
                | entropy increasing.
        
             | jjaken wrote:
             | Heat death doesn't mean "no heat" or that energy has
              | depleted. It just means that energy is fully dispersed. All
              | the heat still exists, it's just that no one place has any
             | more than anywhere else, and so there is no longer any
             | transfer of energy.
        
               | A_D_E_P_T wrote:
               | > so there is no longer any transfer of energy.
               | 
               | Ah, but there is.
               | 
               | The Second Law is a statistical law, not an absolute law.
               | On long enough timescales, low-probability fluctuations
               | in local entropy will allow for energy transfer. These
               | fluctuations will also allow for the formation of
               | structures. (Boltzmann himself was of the opinion that
               | the low-entropy universe emerged from a higher-entropy
               | background state. And indeed there's nothing in physics
               | to rule out the emergence of Boltzmann Brains and even
               | Boltzmann Galaxies from homogeneous and maximally
               | entropic universes in "heat death." This is a
               | philosophical problem of the highest order, because it
               | implies that we're not necessarily going from "less
               | likely to more likely states" as the video implies, but
               | rather from a relatively deterministic state to a
               | probabilistic state.)
        
               | jjaken wrote:
               | Oh interesting, I didn't know that
        
         | dahart wrote:
         | Why is this pithier than the video? I'm not entirely sure I see
         | added pedagogical value. Asking the rhetorical question how can
         | there be a shortage of energy sounds a little like someone
         | sort-of intentionally misunderstanding what that phrase "energy
         | shortage" means in any practical economic context. "Energy
         | shortage" is an economics phrase, not a physics phrase. The
         | first law of thermodynamics doesn't suggest there can't be
         | energy shortages on earth, because the phrase "energy shortage"
          | is not used to suggest a loss of energy to the universe; energy
         | shortages are all about not having enough specific forms of
         | energy in specific places at specific times [1], and it's no
         | surprise that we can't capture dissipated heat, or that a local
         | power system has a maximum limit at any given time, for
         | example.
         | 
         | Something similar could perhaps be said for the video's
         | approach; "what do we get from the sun?" is an ambiguous
         | question, not necessarily a fair setup to ask a lay person when
         | you have entropy in mind as the answer. We do get energy from
         | the sun, that is a correct answer, and we use some of it before
         | it goes away. But, there is the nice a-ha that all the energy
         | from the sun eventually leaves the earth, right?
         | 
         | [1] "An energy crisis or energy shortage is any significant
         | bottleneck in the supply of energy resources to an economy."
         | https://en.wikipedia.org/wiki/Energy_crisis
        
           | JumpCrisscross wrote:
           | It's pithy, but in the way of word play. Energy,
           | colloquially, means useful energy. The question collides the
           | conventional and technical definitions to create the illusion
           | of profundity.
        
             | lisper wrote:
             | Pithy != profound. The intent was to get people to think
             | about the fact that the word "energy" means different
             | things in different contexts, and that the thing that
              | actually has value is not energy but the _absence_ of
             | entropy.
        
       | pcwelder wrote:
        | So the law of increasing entropy is not a fundamental law of
        | reality because it can be derived from other fundamental
       | equations.
       | 
       | Suppose I show you a snapshot of a random universe, would you be
       | able to tell if the entropy of the universe is going to increase
       | or decrease as the time progresses?
       | 
       | Let's assume that universe's entropy would increase. Consider
       | another universe exactly the same as current universe, but all
       | the particles' velocities reversed. Then this universe's entropy
       | would decrease.
       | 
        | So you are equally likely to select either universe, and hence
        | the original assumption of increasing entropy is wrong.
       | 
       | Discarding quantum properties of the particles, is it then fair
       | to say that time's direction is unrelated to whether entropy
       | increases or decreases?
        
         | [deleted]
        
         | [deleted]
        
         | canjobear wrote:
         | The expected increase in entropy can be derived from laws of
         | mechanics plus the critical stipulation that, in the past,
         | entropy was very low. Essentially, physical systems want to be
         | in high-entropy states. So if you observe one to be in a very
         | low-entropy state, then you can conclude that with high
         | probability the future of that system will go to higher-entropy
         | states.
         | 
         | > Suppose I show you a snapshot of a random universe, would you
         | be able to tell if the entropy of the universe is going to
         | increase or decrease as the time progresses?
         | 
         | Yes, if it has low entropy then entropy will probably increase;
         | if it has high entropy then the entropy will probably fluctuate
         | up and down statistically.
         | 
         | > Let's assume that universe's entropy would increase. Consider
         | another universe exactly the same as current universe, but all
         | the particles' velocities reversed. Then this universe's
         | entropy would decrease.
         | 
         | The key is that you're exponentially unlikely to find yourself
         | in a universe where all the particles' velocities are reversed.
         | See this: https://en.wikipedia.org/wiki/Fluctuation_theorem
         | 
         | The probability that a system randomly evolves in a way that
         | reduces entropy is very very small.
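          | 
          | To put rough numbers on "very very small" (a back-of-the-envelope
          | sketch, treating each particle's side of a box as a coin flip):
          | 
          |     import math
          |     for N in (10, 100, 1000):
          |         # chance that all N particles spontaneously end up in
          |         # the left half at the same instant (a big entropy drop)
          |         print(N, f"about 10^{-N * math.log10(2):.0f}")
          |     # a mole of gas (~6e23 particles): exponent ~ -1.8e23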
        
           | cubefox wrote:
           | > > Suppose I show you a snapshot of a random universe, would
           | you be able to tell if the entropy of the universe is going
           | to increase or decrease as the time progresses?
           | 
           | > Yes, if it has low entropy then entropy will probably
           | increase
           | 
           | The problem is that it probably increases in _both_ time
           | directions, such that the state of minimum entropy is _now_.
           | As you said, we have to stipulate that the entropy in the
           | past is low, we can 't (yet?) infer it from observation.
           | Which raises the question what justifies us making this
           | assumption in the first place.
        
         | rixed wrote:
         | If by "random universe" you mean a universe in which all states
          | of every particle are random, then my understanding is that we
         | would probably conclude that entropy is neither increasing nor
         | decreasing. Our universe is not random. We could spot local
          | phenomena where entropy is clearly going in one direction. We
         | assume everywhere the entropy would go in the same direction
         | (increasing), and deduce from this hypothesis that the universe
         | started with a very low entropy.
        
       | friend_and_foe wrote:
       | The video talks about how the earth radiates away the same amount
       | of energy as it gets from the sun, just red shifted. In light of
       | this, let's talk about climate change, global warming and the
       | greenhouse effect.
        
       | szundi wrote:
       | Just after Sabine's
        
         | lll-o-lll wrote:
         | Yes, and the Sabine video contained far more information (lower
         | entropy?) and one genuinely interesting idea I'd never heard
         | before. The idea that heat death may not be the end of
         | intelligent life! The concept this relies on is the idea that
         | macro states are actually just combinations of micro states all
         | of which have the same probability. E.g. the sequence 1, 2, 3,
         | 4, 5 and 3, 1, 4, 2, 5 are equally likely if you are selecting
         | 5 random numbers (1-5), but the ordered sequence is an
          | important state to _us_. The Big Bang -> heat death is just a
          | super unlikely state to a super likely state, but these macro
         | states are poorly defined. They matter to _us_. So perhaps the
         | universe goes on with complex life harvesting neg-entropy from
         | a heat death configuration of micro states as they slowly
         | transition from "extremely unlikely" to "likely" in the context
         | of something incomprehensible to humans. I feel like the idea
         | would need more mathematical meat on the bones to go further,
         | but still an intriguing thought!
        
           | cubefox wrote:
           | I suspect that "macro state that is more important to us"
           | really is special in some objective way, not just
           | subjectively.
        
       | spuz wrote:
       | I think the concept would be easier for me to understand if we
       | talked about the inverse of entropy - i.e. some kind of
       | measurement for the concentration of useful energy or "order". I
       | think it would then be more intuitive to say that this
       | measurement always decreases. Do we even have a word for the
       | opposite of entropy?
        
         | MaxRegret wrote:
         | Negentropy? This is a concept in information theory, but maybe
         | also in physics.
        
           | winwang wrote:
           | And it's closely related to the Gibbs free energy (available
           | energy), which decreases with increasing entropy, all other
           | things equal.
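            | 
            | Roughly, since G = H - T*S (a toy Python illustration with
            | made-up numbers; constant temperature and enthalpy assumed):
            | 
            |     T = 298.0        # K, room temperature (assumed)
            |     H = 1000.0       # J, enthalpy of some toy system
            |     for S in (1.0, 2.0, 3.0):   # J/K, increasing entropy
            |         print(H - T * S)        # G drops: 702, 404, 106 J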
        
         | vehicles2b wrote:
         | I find it intuitive to think of such "ordered" distributions as
         | having a higher compression ratio. (Ie compare the file sizes
         | of zipping a file with just ones or just zeros vs zipping a
         | file with a uniform, random mixture of ones and zeros.)
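          | 
          | A quick way to see it (Python, compressing a kilobyte of each
          | kind of data with zlib; exact sizes will vary a bit):
          | 
          |     import zlib, random
          |     random.seed(0)
          |     ordered = b"0" * 1024    # a kilobyte of just zeros
          |     mixed = bytes(random.choice(b"01") for _ in range(1024))
          |     print(len(zlib.compress(ordered)))   # tiny
          |     print(len(zlib.compress(mixed)))     # much bigger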
        
           | Solvency wrote:
           | Huh? I've never really understood this metaphor.
           | 
           | Take any photograph in Photoshop. First, save one copy of it
           | as a compressed JPG.
           | 
           | Now, on the original, add a small densely repeating tiled
           | pattern multiplied on top as a layer. Like a halftone effect,
           | dot texture, whatever. Technically you're adding more order
           | and less chaos. The resulting image won't compress as
           | efficiently.
        
             | dist-epoch wrote:
             | The idea is to use the best compressor possible. So called
             | Kolmogorov complexity.
        
         | [deleted]
        
         | knolan wrote:
         | Exergy
        
         | syntaxing wrote:
         | Not trying to be cheeky but wouldn't the opposite of entropy be
         | work?
        
         | cjs_ac wrote:
         | Ectropy has been suggested, but the term is not in common use.
        
         | danq__ wrote:
         | [dead]
        
         | Timon3 wrote:
         | I might be completely off kilter here (please tell me if that's
         | the case!), but what makes sense to me is to think about "how
         | much entropy is left", as in "how much can entropy still
         | increase between the current state and the highest entropy
         | state". That flips the meaning to describe what you're talking
         | about, and feels very intuitive to me.
        
           | spuz wrote:
           | When you say "how much entropy is left?" you make it sound
           | like a quantity that is decreasing, not increasing. That
           | seems incorrect, no?
        
             | Timon3 wrote:
             | That's why I specified that it's meant as "how much can
             | entropy still increase between the current state and the
             | highest entropy state". If you can't read it in that sense,
             | either ignore the shorter question or read it as "how much
             | increase of entropy left". Or maybe "how much entropy left
             | until max".
        
               | nobody9999 wrote:
               | >"how much entropy left until max".
               | 
               | 10^106 years[0] worth or so.
               | 
               | [0]
               | https://en.wikipedia.org/wiki/Heat_death_of_the_universe
        
         | kledru wrote:
         | we do have a word -- "negentropy". Physicists also sometimes
         | find it easier to talk about "negentropy" instead of entropy.
        
         | bob1029 wrote:
         | > Do we even have a word for the opposite of entropy?
         | 
         | I've always referred to the inverse as "information" or
         | "order".
        
           | cshimmin wrote:
           | "information" is not a good term for the opposite. Entropy is
           | a well defined concept in information theory (and can be
           | connected to the physical concept). More entropy means more
           | information, not less.
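            | 
            | For example, in Shannon's sense (a small Python sketch, entropy
            | in bits of a discrete distribution):
            | 
            |     import math
            |     def H(p):
            |         # Shannon entropy in bits
            |         return -sum(x * math.log2(x) for x in p if x > 0)
            |     print(H([1.0]))        # 0.0, fully predictable
            |     print(H([0.5, 0.5]))   # 1.0, one fair coin flip
            |     print(H([0.25] * 4))   # 2.0, more possibilities, more info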
        
             | bob1029 wrote:
             | I agree. Order is a better term to describe it.
             | Predictability. More entropy = more total possible states
             | the system can be in.
             | 
             | I tend to use "information" to refer to a statistically-
             | significant signal or data that the application/business
             | can practically utilize. This is definitely not the same as
             | the strict information theoretical definition.
        
             | kgwgk wrote:
             | > Entropy is a well defined concept in information theory
             | (and can be connected to the physical concept). More
             | entropy means more information, not less.
             | 
             | More entropy means more missing information.
             | 
             | https://iopscience.iop.org/book/mono/978-0-7503-3931-5/chap
             | t...
        
               | kikokikokiko wrote:
               | As I understand it in information theory, "more entropy"
               | equals "more information is necessary to fully describe
               | this thing", but I may be wrong.
        
               | kgwgk wrote:
               | "more [additional] information is necessary to fully
               | describe this thing" = "more missing information [in the
               | incomplete description of the thing]"
        
         | dangitnotagain wrote:
         | Potential
        
         | tnecniv wrote:
         | Normally we care not about entropy but changes in entropy or
         | entropy measured in comparison to a reference distribution, so
         | you can just take the negative of that difference.
        
       | danq__ wrote:
       | The most intuitive explanation of entropy ever:
       | 
       | entropy is a fancy way of explaining probability.
       | 
       | Things with higher probability tend to occur over things of lower
       | probability.
       | 
       | Thus when certain aspects of the world like configurations of gas
       | particles in a box are allowed to change configurations, they
       | will move towards high probability configurations.
       | 
       | High probability configurations tend to be disordered. Hence the
       | reason why we associate entropy with things becoming increasingly
       | disordered. For example... gas particles randomly and evenly
        | filling up an entire box is more probable than all gas particles
       | randomly gathering on one side of the box.
       | 
        | If you understand what I just explained, then you understand
       | entropy better than the majority of people.
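        | 
        | A toy version of that counting in Python (my own sketch, 10
        | particles that can each sit in the left or right half of a box):
        | 
        |     from math import comb
        |     N = 10            # number of gas particles (toy size)
        |     total = 2 ** N    # each left/right assignment is a microstate
        |     for k in range(N + 1):
        |         # comb(N, k) microstates have k particles on the left
        |         print(k, comb(N, k) / total)
        |     # the even split (k=5) is the most probable macrostate; all on
        |     # one side (k=0 or k=10) has probability 1/1024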
        
       | guerrilla wrote:
       | It's actually amazing how bad science popularizers are at
       | explaining entropy. It's really not that difficult if they would
       | just take some time and think before they speak (which I'm sure
       | this video proves based on what people are saying about it.)
        
       | martythemaniak wrote:
       | Great video, but very wrong to cite Jeremy England. Ilya
       | Prigogine came up with the concept of Dissipative Structures and
       | won the Nobel Prize in Chemistry in 1977 for that work. He also
        | has a couple of pop-sci books on the subject that I found super
       | illuminating. They are a little bit challenging to read, but
       | they're very thorough and Order out of Chaos in particular has a
       | fantastic summary of 400 years of philosophy of science. Highly
       | recommend reading the OG.
        
       | ctafur wrote:
        | Regarding entropy, and thermodynamics in general, I can't
       | recommend enough the notes [0] of prof. Zhigang Suo of Harvard.
       | It's a new way of presenting thermodynamics and I finally get
       | it... contrary to when I took a thermo course at university.
       | 
       | [0]:
       | https://docs.google.com/document/d/10Vi8s-azYq9auysBSK3SFSWZ...
       | 
       | Also prof. Suo puts entropy as the main character of the "play".
       | The other concepts (temperature, etc.) are defined from entropy.
        
       | manojlds wrote:
       | When I was watching the video I was thinking this deserves to be
       | posted on HN and yup, someone already has.
        
       | hilbert42 wrote:
       | That's an excellent overall summary as he covers almost every
       | aspect of the subject albeit in brief. It would be good if he
       | produced a second video dealing with the low entropy of incoming
       | energy from the sun and the higher entropy of radiated energy
       | from earth and relate that to global warming.
       | 
        | In all the debate over global warming, little is said about why,
        | say, CO2 and other greenhouse gases increase the earth's
       | temperature and how they shift the wavelength of the radiated
       | energy from earth. In other words we need to explain in simple
       | terms why the incoming and outgoing energy can remain the same
       | yet the earth's temperature has increased.
        
         | jjaken wrote:
         | The important part to understand is timescales. In a day, the
         | Earth does absorb some energy. Of course it does, plants
         | collect it, solar panels collect, the ocean and land collect
         | it. The amount Earth collects is a tiny fraction of what it
         | releases. That collection isn't permanent though and is slowly
         | released. Within a day, the Earth absorbs some energy, but over
         | a long enough timescale, all of that energy is released again.
         | 
         | The Earth is taking on energy every day from the sun. If we
         | didn't release it all back, the earth would be warming much
         | much faster. It only remains relatively cool because it
         | releases almost as much as it receives.
         | 
         | Another important note is that long term energy is not only
         | stored as heat on earth. It's stored as potential energy in the
         | atoms of cells in plants and animals. Think of how cold a
         | gallon of gasoline is, yet how much energy it stores.
         | 
         | For an example think of hot asphalt from a summer day. It gets
         | real hot all day and slowly cools down at night. Sometimes it
         | can be pretty warm to stand on the road even if it's a cool
         | night.
         | 
         | Within the human timescale, the Earth is retaining some (tiny
         | fraction) of heat. That tiny fraction of heat is a very small
         | window of heat that life can tolerate. It's not too much and
         | not too little. If the earth were to retain just a tiny bit
         | more, suddenly life can't tolerate it. On the scale of the
         | universe, the difference between those realities is minuscule,
         | even though it's enormous to us.
        
         | gorhill wrote:
         | My understanding:
         | 
         | Global warming occurs because the previous equilibrium between
         | incoming and outgoing energy has been broken by changes in the
         | composition of the atmosphere.
         | 
         | So until we reach a new equilibrium long after the atmosphere
         | composition ceases to change, the outgoing energy will be less
         | than the incoming energy.
        
           | kergonath wrote:
           | That's exactly it. Except that equilibrium will never be
           | reached, it's more like tending towards a steady state.
        
         | willis936 wrote:
         | Which debate over global warming are you referring to? There
          | are debates that involve atmospheric chemists that discuss the
          | causes and effects of Earth darkening.
         | 
         | https://en.m.wikipedia.org/wiki/Albedo
        
           | hilbert42 wrote:
           | Right, the average person hasn't a clue about albedo, nor do
            | they know why, say, CO2 and CH4 increase global warming whereas
           | others such as O2 are more benign.
           | 
           | It may help lower the temperature of the debate if they did.
           | 
            |  _Edit, we're pitching this discussion at the level he has--
           | the lay public. Scientific argument over the minutiae is
           | another matter altogether._
        
             | jjaken wrote:
             | Yeah debates in climate science are about phenomena
             | laypeople don't even know exist. It's about how what we
             | observe happens. No one is arguing about what we're
             | observing, eg global warming.
        
       | lvl102 wrote:
       | I'd like to think of entropy in terms of randomness or rather
       | uniqueness of elements/compounds within a system.
        
       | 4ad wrote:
       | I hate Veritasium's clickbait, and I think most of his videos are
       | very poor, but this one is the exception. It's very well put
       | together. The first ten minutes of the video is _exactly_ how I
       | introduce entropy to people.
       | 
        | Of course I can't give him a pass on how crass it was telling
        | that woman he has a PhD in physics (he does not). The video would
        | have been so much better without those two seconds of footage...
        
         | [deleted]
        
         | [deleted]
        
         | manojlds wrote:
         | Most of his videos are GOOD, come on!
        
           | kergonath wrote:
           | The problem is that on a given specific subject you can never
           | be sure whether he's exaggerating or misrepresenting things.
           | Just this makes watching him a waste of time, because then
           | you need to spend at least twice the time to fact check him.
           | A bit like asking a question to ChatGPT. At least Wikipedia
           | provides you links to proper sources.
        
         | hilbert42 wrote:
         | _" I think most of his videos are very poor,"_
         | 
         | Why do you think so? (Most to me seem reasonable but one on
         | speed of electricity stands out as badly done (he redid the
         | video but it too could have been better).)
        
           | pxeger1 wrote:
           | He seems to exaggerate the importance of things when they
           | make for a good story and sound interesting. This is a
            | classic flaw in popular science, but I think he's gotten a lot
            | more egregious with it over the years.
           | 
           | The worst example I remember, which is actually what drove me
           | to unsubscribe, was when he said that the golden ratio was "a
           | pretty five-y number" because it can be written as 0.5 + 0.5
           | * (5^0.5). Anyone with a good mathematical background could
           | tell you there's nothing five-y about 0.5 at all. I'll grant
           | him, the golden ratio is still a little bit five-y because of
           | the sqrt(5).
           | 
           | The whole context and presentation seemed like it was
           | designed to make the viewer feel like they'd learnt something
           | even though nothing of substance was really delivered in
           | those 20 seconds. He does that a lot.
        
             | interestica wrote:
             | > Anyone with a good mathematical background could tell you
             | there's nothing five-y about 0.5 at all.
             | 
             | Um, I disagree? Visually, the 5 is memorable here. "Fivey"
             | just seems to mean "lots of the number 5"?
        
               | pxeger1 wrote:
               | I think the context was perhaps more important than I'd
               | considered to explain the significance of "five-y". The
               | implication was that the fiviness gives something of an
                | intuitive explanation for why some shape had 5 sides.
                | The presence of a √5 does (maybe) do this, but 0.5
                | definitely doesn't. (Because as the sibling comment points
                | out, 0.5 = 1/2 and the 5 only appears due to our
                | (arbitrary, from a mathematical point of view) choice of
                | base ten.)
        
               | koningrobot wrote:
               | I think the point is that 0.5 is 1/2, and the 5 digit
               | only appears because of our base 10 presentation of
               | numbers.
        
           | AlexandrB wrote:
           | I was quite disappointed by his video on self-driving cars.
           | It presented a very one-sided view and felt like a puff piece
           | (and, indeed, it was sponsored by Waymo). Tom Nicholas did a
           | good job breaking down the problems with it: https://m.youtub
           | e.com/watch?v=CM0aohBfUTc&pp=ygUPdG9tIHZlcml...
        
           | hospitalJail wrote:
           | His dandruff ad was pretty cringe IMO. But idk, it seems like
           | his videos are really long for what they accomplish in
           | general. He seems like the generic youtuber that milks every
           | dollar of ad revenue and is shameless about it.
           | 
            | Kind of sad that an expensive camera + clickbait
           | thumbnail/title > Experts communicating clearly and
           | accurately.
           | 
           | I imagine he/his team is scouring youtube for the experts,
           | and remaking their videos with more production value.
        
           | willis936 wrote:
           | His video on electromagnetism is still my gold standard for
           | Lorentz's Law. It came out while I was taking emag. I liked
           | it so much I showed it to my professor, who didn't name drop
           | Lorentz all semester. The class was making sure no one got a
           | BSEE without knowing Maxwell's Equations, which does warrant
           | a semester. I guess it was more of a failing of the physics
           | curriculum.
        
         | andyjohnson0 wrote:
         | > course I can't give him a pass on how crass it was telling
          | that woman he has a PhD in physics (he does not)
         | 
         | To be clear, he does have a PhD but it is in physics education
         | research, not physics.
        
         | PrimeMcFly wrote:
         | > I think most of his videos are very poor,
         | 
         | This sounds quite bitter, as does griping about him mentioning
         | his PhD.
         | 
         | His videos have excellent production quality, and do a great
         | job of communicating advanced STEM concepts to laypeople in an
         | entertaining way.
         | 
          | Maybe you don't _like_ them, but that doesn't mean they are
         | bad. Given their popularity, it would seem they are anything
         | but.
        
       | hospitalJail wrote:
       | Last night me and my wife were deciding if we should watch The
       | Witcher or this video.
       | 
       | I decided I didn't have the brainpower/mental capacity to think
       | about The Witcher and that this video on Entropy would be easier
       | to digest.
        
         | SanderNL wrote:
         | Not sure if you are humble bragging, but I agree The Witcher is
         | more demanding. Layers of meaning, emotions, allegory, subtext.
         | It's no Shakespeare, but physics and math are simple in
         | comparison especially in the wonderfully produced and easily
         | digestible format of Veritasium.
        
           | Balgair wrote:
           | I mean, the author has a doctoral degree in science
           | communication. It's his job and the point of the channel to
           | try to make things easy to understand.
           | 
           | The opposite is true with fiction. You're intentionally
           | trying to have the audience make connections themselves, like
           | a Sherlock story or the Great Gatsby. The point is in the
           | discovery by the viewer.
        
       | javajosh wrote:
       | Entropy only made sense when I learned it from the perspective of
       | statistical thermodynamics. It's a very programmerly
       | understanding, IMHO, and it's quite intuitive. EXCEPT that the
       | language used is ridiculous: _grand canonical ensemble_ indeed!
       | Anyway, the idea that a system can be in some number of specific
       | states, and that equilibrium is that unique situation where the
       | number of possible specific states is at its maximum, really
       | spoke to me.
        
         | dekhn wrote:
         | I took a stat thermo class and it was basically all about
          | entropy, which was expressed as ln W - the log of the number of
         | ways (permutations) that a system can be ordered, which gives a
         | convenient denominator when calculating the probability of a
         | specific permutation. Here's the professor's book, which was
          | still only in LaTeX form when we took the class:
         | https://www.amazon.com/Molecular-Driving-Forces-Statistical-...
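          | 
          | Something like this, if I remember it right (a tiny sketch with
          | toy numbers, nothing physical):
          | 
          |     import math
          |     # W = number of ways (permutations) the system can be
          |     # arranged, e.g. choosing which k of N sites are occupied
          |     N, k = 10, 3
          |     W = math.comb(N, k)
          |     print(W)             # 120 arrangements
          |     print(math.log(W))   # S = ln W, in units of k_B
          |     print(1 / W)         # probability of a specific arrangement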
        
           | javajosh wrote:
           | yes, there are lots of quantitative details. I wanted to
           | emphasize the key qualitative concept, from which the others
           | can derive. In a similar way you can derive all of special
           | relativity, and approach an intuition about the strangeness
           | of spacetime, starting with only two ideas: the laws of
           | physics are the same in all reference frames; the speed of
           | light is constant. I prefer to start there and derive e.g.
           | Lorentz factors than start with the mathy stuff.
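            | 
            | E.g. the time-dilation factor drops out of the light-clock
            | argument with nothing more than those two ideas; a quick sketch
            | (c rounded, speeds picked arbitrarily):
            | 
            |     import math
            |     c = 3.0e8   # m/s, rounded
            |     def gamma(v):
            |         # Lorentz factor: 1 / sqrt(1 - v^2/c^2)
            |         return 1.0 / math.sqrt(1.0 - (v / c) ** 2)
            |     for v in (0.1 * c, 0.5 * c, 0.9 * c, 0.99 * c):
            |         print(f"v = {v/c:.2f}c  gamma = {gamma(v):.2f}")
            |     # a moving clock ticks gamma times slower in the lab frame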
        
             | kergonath wrote:
             | > starting with only two ideas: the laws of physics are the
             | same in all reference frames; the speed of light is
             | constant.
             | 
             | Isn't this redundant, though? The constant velocity for
             | light in a vacuum comes directly from the laws of
             | (classical) electromagnetism in the form of Maxwell's
             | equations. So "the laws of Physics are the same in all
             | reference frames" implies "Maxwell's equations are valid in
             | all reference frames", which in turn implies "the velocity
             | of light in vacuum is the same in all reference frames".
             | That's what I understood reading Einstein himself.
             | 
             | I think it's much stronger that way. Otherwise we get to
             | why light should be a special case, which is difficult to
             | defend. The constant velocity of light (in vacuum) being an
             | unavoidable consequence of the laws of Physics makes it
             | much stronger.
             | 
             | > I prefer to start there and derive e.g. Lorentz factors
             | than start with the mathy stuff.
             | 
             | That's how Einstein himself explained it (with trains and
             | stuff, but still) and it makes a lot of sense to me. Much
             | more than the professor who did start with the Lorentz
             | transform and then lost everyone after 20 minutes of maths.
        
         | guga42k wrote:
          | If somebody needs to build an intuition about entropy, they
          | could think about a simple problem.
          | 
          | You are given an insulated cylinder with a barrier in the middle.
          | The left side of the cylinder is filled with ideal gas A, and the
          | right side with gas B. Given a particle, one can distinguish A
          | from B. The pressure and temperature on both sides are the same.
          | Then you remove the barrier and the gases mix. Question: how much
          | work do you need to do to return the system to its original
          | state? Hint: the minimum work is the temperature times the
          | entropy difference between the two states.
          | 
          | More generally, if you take a properly insulated system and leave
          | it be for a while, all of a sudden you will have to do some work
          | to come back to the original state, even though the energy
          | conservation law holds.
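          | 
          | To put numbers on it (a toy calculation: one mole of each gas at
          | 300 K, using the ideal-gas entropy of mixing):
          | 
          |     import math
          |     R = 8.314     # J/(mol K), gas constant
          |     T = 300.0     # K, assumed temperature
          |     n = 1.0       # mol of each gas, assumed
          |     # each gas doubles its volume when the barrier goes, so
          |     # each gains n*R*ln(2) of entropy
          |     dS = 2 * n * R * math.log(2)   # ~11.5 J/K
          |     print(dS, T * dS)              # minimum unmixing work ~3.5 kJ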
        
           | kuchenbecker wrote:
           | Enter Maxwell's demon as a completely valid solution to this
           | problem showing you can decrease entropy within that system
           | (but you need to exclude the demon from the system).
        
           | ithkuil wrote:
            | If you need to do work in order to revert to the previous
            | state, does it imply you can extract work when going from the
            | first to the second state?
           | 
           | Given the scenario you just laid out it seems no work can be
            | extracted just by letting two substances mix that are at the
           | same temperature and pressure. But there is something about
           | it that doesn't quite add up to my intuition of symmetry and
           | conservation laws. Could you please elaborate more on that?
        
             | guga42k wrote:
             | >If you need to do work in order to revert to the previous
             | state, does it imply you can extract work when going to the
             | first to the second state?
             | 
              | Nope. The work comes from the system going from an ordered
              | state into an unordered one. The problem above is good for
              | intuition because you can work out how to reverse the
              | state. You invent a semi-magical barrier which is fully
              | transparent to particles A and reflects particles B, then
              | you start to push that barrier from left to right up to the
              | middle, compressing gas B (and doing work!), leaving the
              | left part with gas A only, then repeat a similar exercise
              | on the right side.
             | 
             | >Given the scenario you just laid out it seems no work can
             | be extracted just by letting mix two substances that are at
             | the same temperature and pressure. But there is something
             | about it that doesn't quite add up to my intuition of
             | symmetry and conservation laws. Could you please elaborate
             | more on that?
             | 
              | As far as I understand, this asymmetry was the exact reason
              | why entropy was introduced. It was later explained by
              | Boltzmann as a measure of the number of microscopic states.
             | 
              | Naturally, the second law of thermodynamics forbids
              | perpetual motion machines.
        
             | Lichtso wrote:
              | I think you can very well extract work by having a
              | membrane that selectively lets one substance mix into the
              | other but not the other into the first [0]. It is called
              | Osmosis [1].
             | 
             | [0]: https://en.wikipedia.org/wiki/Semipermeable_membrane
             | [1]: https://en.wikipedia.org/wiki/Osmosis
        
               | ithkuil wrote:
               | I guess what's confusing me in this scenario is that
               | we're not saying that the two halves of the cylinder
               | contain particles with different properties (e.g.
               | different velocities) but only that we can "tell them
               | apart" as if they were coloured differently, but
               | otherwise behaving in exactly the same way.
               | 
               | The former scenario is famously the setting for Maxwell's
               | daemon. I was assuming this scenario is something else.
               | 
               | I'm confused because on one hand I can see that it
               | requires work to reorder the particles once they have
               | been shuffled around. On the other hand I don't see how
               | one could extract work while they get shuffled around if
               | they all have the same momenta.
               | 
                | Perhaps the answer is that we cannot have a system where
                | the microscopic entities are at the same time
                | indistinguishable but also distinguishable. Perhaps if
                | they had different "colours" it means they do interact
                | differently with the environment? I'm still confused,
                | frankly.
        
         | passion__desire wrote:
          | Entropy is a mathematical force, to be honest.
        
           | canadianfella wrote:
           | [dead]
        
           | Solvency wrote:
           | Huh? "Force"?
        
             | esafak wrote:
             | concept
        
               | danq__ wrote:
               | [dead]
        
       | TexanFeller wrote:
       | I recently enjoyed this presentation by Sean Carroll that touches
       | on definitional and philosophical issues with entropy. The talk
       | made me feel less stupid for not feeling entirely comfortable
       | with how entropy was explained to me before. Turns out there are
       | a few different ways to define and quantify entropy that are used
       | in different contexts and they each have some unresolved
       | philosophical issues.
       | 
       | "Can you get time out of Quantum Mechanics?":
       | https://youtu.be/nqQrGk7Vzd4
        
         | kergonath wrote:
         | I would recommend reading some Carlo Rovelli, it sounds like
         | something you might like.
        
         | cubefox wrote:
         | Another surprising thing is that physicists have not yet
         | succeeded in reducing the ordinary notion of cause and effect
         | to fundamental physics. Carroll has also worked on this issue.
        
         | m3affan wrote:
          | I wonder how many concepts are just too complicated for our
          | brains to formalize or even process.
        
       | sourcecodeplz wrote:
       | It was a great video
        
         | [deleted]
        
       | dist-epoch wrote:
       | Sabine Hossenfelder also had a video recently on entropy:
       | 
       | > I don't believe the 2nd law of thermodynamics.
       | 
       | https://www.youtube.com/watch?v=89Mq6gmPo0s
        
         | evouga wrote:
         | What I really like about this explanation is that it highlights
          | the fact that entropy is not a natural property of the physical
         | system: entropy is only defined with respect to some coarse-
         | graining operation applied by an imperfect observer of the
         | system. So as Sabine points out it seems we should really be
         | talking about multiple different entropies, each of which
         | corresponds to a different mechanism for coarse-graining
         | microstates into macrostates, with each different entropy
         | changing at different rates depending on the coarse-graining
         | mechanism and physical system. (And in particular, God
         | observing the universe would not see entropy change at all;
         | even if there were uncertainty in the initial conditions of the
         | universe, God would see that uncertainty perfectly propagated
         | with no loss of information, in a way made precise by
         | Liouville's Theorem.)
         | 
         | But even this is not the full story, because I can take a mass-
         | spring network, and no matter how I choose to coarse-grain it,
         | I will not see the entropy corresponding to that coarse-
         | graining increase, because the trajectory of a mass-spring
         | system is periodic. Entropy increase requires that the system
         | is ergodic with respect to the chosen coarse-graining
         | operation, i.e. that over long times the trajectory visits the
         | coarse-grained states in a "random" and uniform way. It's not
         | at all obvious to me why the dynamics of particles bouncing
         | around in a box have this property, and particles attached in a
         | mass-spring network do not; and neither the Sabine nor the
          | Veritasium videos address this or why we should expect all
          | practical real-world physical systems to be ergodic with
         | respect to practical coarse-graining mechanisms.
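          | 
          | A crude numerical illustration of that contrast (my own toy
          | sketch, not from either video): coarse-grain positions into bins
          | and track the Shannon entropy of the bin occupations. Free
          | particles on a ring with a spread of speeds relax and stay
          | spread out; oscillators sharing one frequency, released from
          | rest, come back exactly every period.
          | 
          |     import numpy as np
          |     rng = np.random.default_rng(0)
          |     N, bins = 2000, 20
          |     def coarse_S(x, lo, hi):
          |         # entropy of the coarse-grained occupation fractions
          |         p, _ = np.histogram(x, bins=bins, range=(lo, hi))
          |         p = p[p > 0] / len(x)
          |         return float(-(p * np.log(p)).sum())
          |     xg = rng.uniform(0.0, 0.05, N)   # "gas": bunched start
          |     vg = rng.normal(1.0, 0.3, N)     # spread of speeds
          |     xs = rng.uniform(0.8, 1.0, N)    # "springs": bunched, at rest
          |     w = 2.0 * np.pi                  # one shared frequency
          |     for t in (0.0, 0.3, 1.0, 3.0, 10.0):
          |         s_gas = coarse_S((xg + vg * t) % 1.0, 0.0, 1.0)
          |         s_spr = coarse_S(xs * np.cos(w * t), -1.0, 1.0)
          |         print(t, round(s_gas, 2), round(s_spr, 2))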
        
           | dist-epoch wrote:
           | > mass-spring system is periodic
           | 
           | I don't pretend to understand this stuff, but wouldn't a real
           | mass-spring system slowly stop, due to friction, air
           | resistance, heat dissipation, ...? So a real system wouldn't
           | be periodic.
        
             | consilient wrote:
             | Yes, but they're talking about an idealized harmonic
             | oscillator, not a physical mass-spring system.
        
             | sidlls wrote:
             | Periodic doesn't mean perpetual, perfect, constant periodic
             | motion (in general).
        
       | thumbuddy wrote:
       | In my opinion the most misunderstood concept from physics is
       | probably any exponential relationship. I realize we could view
       | entropy as one of those if we flip the relationship around and
       | solve for the number of microstates, making my statement the
       | superset. But generally speaking, I've seen both lay people and
       | experts struggle to reason about them, especially when complex
       | numbers are involved.
        
       | dangitnotagain wrote:
       | Entropy should be redefined as "the distribution of potential
       | over negative potential."
       | 
       | Whether discussing what is over what may be, or thermal
       | equilibrium, potential distribution describes it all!
        
       | antimora wrote:
       | The video did not explain why the sun is a low entropy source. I
       | found the following explanation, which I am sharing with you:
       | 
       | So, the sun is a low-entropy source of energy, and Earth (and
       | everything on it) increases that entropy as it uses and then
       | reradiates that energy. This process is entirely consistent with
       | the second law of thermodynamics.
       | 
       | The relationship between light frequency and entropy comes from
       | the fact that entropy is a measure of disorder or randomness.
       | High-frequency light, such as ultraviolet or visible light, is
       | more ordered and less random than lower-frequency light, such as
       | infrared or microwave light.
       | 
       | This is due to how light is structured. Light is made up of
       | particles called photons, and each photon carries a certain
       | amount of energy. The energy of a photon is directly proportional
       | to its frequency: higher-frequency photons carry more energy than
       | lower-frequency ones.
       | 
       | So, if you have a fixed amount of energy to distribute among
       | photons, you can do so in many more ways (i.e., with higher
       | entropy) if you use low-energy, low-frequency photons. That's
       | because you would need many more of them to carry the same total
       | amount of energy.
       | 
       | On the other hand, if you use high-energy, high-frequency
       | photons, you would need fewer of them to carry the same total
       | amount of energy. There are fewer ways to distribute the energy
       | (i.e., lower entropy), so this arrangement is more ordered and
       | less random.
       | 
       | Therefore, high-frequency light is considered a lower-entropy
       | form of energy compared to low-frequency light, because the
       | energy is concentrated in fewer, more energetic photons.
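       | 
       | A rough numerical sketch of that counting argument; the
       | wavelengths and temperatures below are illustrative round
       | numbers, not from the video:
       | 
       |   # Energy per photon is E = h*c/wavelength, so a joule of
       |   # long-wavelength light is spread over many more photons.
       |   h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23  # SI units
       | 
       |   def photons_per_joule(wavelength_m):
       |       return wavelength_m / (h * c)
       | 
       |   print(f"visible, 0.5 um:   {photons_per_joule(0.5e-6):.2e}")
       |   print(f"thermal IR, 10 um: {photons_per_joule(10e-6):.2e}")
       | 
       |   # Standard photon-gas result: blackbody radiation carries
       |   # entropy (4/3)*E/T, so entropy per joule scales as 1/T.
       |   for name, T in (("sunlight ~5800 K", 5800.0),
       |                   ("Earth IR ~255 K ", 255.0)):
       |       print(f"{name}: {4 / (3 * T * k_B):.2e} * k_B per joule")
       | 
       | Same energy, roughly twenty times the photons (and roughly
       | twenty times the entropy per joule) on the way out compared to
       | the way in, which is the sense in which sunlight is a low-entropy
       | energy source.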
        
         | kgwxd wrote:
         | > The video did not explain why the sun is a low entropy
         | source.
         | 
         | Layman to the extreme here, but didn't it? The thing about the low
         | entropy of the universe near the big bang, gravity naturally
         | bringing things together, and such?
        
         | guga42k wrote:
         | >The video did not explain why the sun is a low entropy source.
         | I found the following explanation, which I am sharing with you:
         | 
         | To my best understanding, to go from a high entropy state to a
         | low entropy state you need to do work. The sun is the source of
         | energy to do that work.
        
       | rssoconnor wrote:
       | While this is a reasonable historical explanation of entropy, and
       | explains that we don't gain net energy from the sun, it still
       | misses the mark on what entropy is now known to be.
       | 
       | Entropy isn't a property of an object, or a system or things in
       | physics. Entropy is a property of our _description_ of systems.
       | More precisely, it is a measure of how imprecise a given
       | specification of a physical system is, i.e. given a description of
       | a system (typically the pressure / volume / temperature of a gas
       | or whatnot), how many different physical states correspond to
       | such a description.
       | 
       | In particular, _thermodynamic entropy is Shannon entropy_.
       | 
       | In the case where the description specifies a volume of
       | phase space within which the physical state lies, the
       | entropy is the logarithm of the volume of this region of phase
       | space. If we take this collection of states and see how they
       | evolve in time, then Liouville's theorem says the volume of phase
       | space will remain constant.
       | 
       | If we want to build a reliable machine, i.e. an engine, that can
       | operate in any initial state that is bounded by our description,
       | and end up in a final state bounded by some other description,
       | then in order for this machine to perform reliably, the volume
       | of the final description needs to be at least as large as that of
       | the description of the initial state. Otherwise, some possible
       | initial states will fail to end up in the desired final state.
       | This is the essence of the second law of thermodynamics.
       | 
       | I want to emphasize this: entropy exists in our heads, not in the
       | world.
       | 
       | E.T. Jaynes illustrated this in section "5. The Gas Mixing Scenario
       | Revisited" of
       | https://www.damtp.cam.ac.uk/user/tong/statphys/jaynes.pdf where
       | two imaginary variants of Argon gas are mixed together. If one
       | engineer is ignorant of the different variants of Argon gas, it
       | is impossible to extract work from the gas, but armed with
       | knowledge of the difference (which must be exploitable otherwise
       | they wouldn't actually be different) work can be extracted.
       | 
       | Knowledge _is_ power.
       | 
       | Taking an extreme example, suppose we have two volumes of gas at
       | different volumes / pressures / temperatures. We can compute how
       | much work can be extracted from those gases.
       | 
       | But, suppose someone else knows more than just the volume /
       | pressure / temperature of these gases. This someone happens to
       | know the precise position and velocity of every single molecule
       | of gas (more practically they know the quantum state of the
       | system). This someone now gets to play the role of Maxwell's
       | demon and separate all the high velocity and low velocity
       | molecules of each chamber, opening and closing a gate using their
       | perfect knowledge of where each particle is at each moment in
       | time. From this they can now extract far more work than the
       | ignorant person.
       | 
       | In both cases the gas was identical. How much useful work one can
       | extract depends on how precise one's knowledge of the state of
       | that gas is.
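       | 
       | A toy version of "entropy is a property of the description", with
       | the 100-particle system and the two descriptions below chosen
       | purely for illustration:
       | 
       |   from math import comb, log
       | 
       |   N = 100
       |   # One specific microstate: which of the N two-state particles
       |   # are in state 1 (say, in the left half of the box).
       |   microstate = [1] * 50 + [0] * 50
       | 
       |   # Description A: "exactly 50 of the 100 particles are in
       |   # state 1" -> many compatible microstates.
       |   S_coarse = log(comb(N, 50))
       | 
       |   # Description B: the exact microstate is known -> exactly one
       |   # compatible microstate.
       |   S_exact = log(1)
       | 
       |   print(f"coarse description: S = {S_coarse:.1f} k_B")
       |   print(f"exact description:  S = {S_exact:.1f} k_B")
       | 
       | The physical state is the same in both cases; only the
       | description changes, and with it the entropy (about 66.8 k_B
       | versus 0). An observer holding the exact description can, in
       | principle, extract on the order of k_B*T*(S_coarse - S_exact)
       | more work, which is the "knowledge is power" point above.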
        
         | cubefox wrote:
         | Hossenfelder says the same in her entropy video. A really
         | interesting hypothesis.
        
         | pcwelder wrote:
         | This is eye-opening. Thanks a lot for this comment and for
         | linking the pdf. I loved E.T. Jaynes' Probability Theory book, so
         | looking forward to reading this pdf too.
        
           | kgwgk wrote:
           | There are a few chapters of an unpublished book on
           | thermodynamics here: https://bayes.wustl.edu/etj/thermo.html
           | 
           | This article is also interesting: "THE EVOLUTION OF CARNOT'S
           | PRINCIPLE" https://bayes.wustl.edu/etj/articles/ccarnot.pdf
           | 
           | Building on these ideas, the first five chapters of this
           | (draft of a) book from Ariel Caticha are quite readable:
           | https://www.arielcaticha.com/my-book-entropic-physics
        
         | ko27 wrote:
         | Entropy is very much "real" and it exists outside of our mind.
         | The resolution to Maxwell's demon is that knowledge of every
         | particle's state is not free: acquiring that knowledge increases
         | the system's entropy by more than the demon can ever eliminate
         | by opening chamber doors.
         | 
         | If it only existed in our minds and not in physical reality
         | that would mean it would be possible to construct a device that
         | decreases global entropy on average.
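         | 
         | The quantitative form of "knowledge is not free" is Landauer's
         | bound: erasing (or irreversibly recording) one bit dissipates at
         | least k_B*T*ln(2). A quick sketch, assuming room temperature
         | just for the sake of a number:
         | 
         |   from math import log
         | 
         |   k_B, T = 1.381e-23, 300.0  # J/K, kelvin
         |   print(f"min cost per bit: {k_B * T * log(2):.2e} J")
         | 
         | About 2.9e-21 J per bit: tiny, but the demon pays it for every
         | particle it measures and forgets, which is what keeps the total
         | entropy from going down on average.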
        
       | [deleted]
        
       | Lichtso wrote:
       | I think the most unintuitive, even unsettling, aspect of entropy is
       | that the entropy of black holes is proportional to their surface
       | area, not their volume [0]. That is only briefly mentioned in the
       | video and not discussed any further.
       | 
       | [0]
       | https://en.wikipedia.org/wiki/Holographic_principle#Black_ho...
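       | 
       | The formula in question is the Bekenstein-Hawking entropy,
       | S = k_B * c^3 * A / (4 * G * hbar), with A the horizon area. A
       | quick sketch for a one-solar-mass black hole (the mass is just an
       | example):
       | 
       |   from math import pi
       | 
       |   G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23
       |   M = 1.989e30                      # kg, one solar mass
       | 
       |   r_s = 2 * G * M / c**2            # Schwarzschild radius
       |   A = 4 * pi * r_s**2               # horizon area
       |   S = k_B * c**3 * A / (4 * G * hbar)
       |   print(f"r_s = {r_s:.2e} m, S = {S:.2e} J/K = {S / k_B:.2e} k_B")
       | 
       | Roughly 1e77 k_B for a single solar mass. Because S scales with
       | the area, doubling the mass quadruples the entropy, where a
       | volume law would have multiplied it by eight.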
        
       | max_ wrote:
       | Sad he didn't talk about Shannon Entropy
        
       | danq__ wrote:
       | [dead]
        
       ___________________________________________________________________
       (page generated 2023-07-02 23:00 UTC)