[HN Gopher] What Is Entropy?
       ___________________________________________________________________
        
       What Is Entropy?
        
       Author : ainoobler
       Score  : 101 points
       Date   : 2024-07-22 18:33 UTC (4 hours ago)
        
 (HTM) web link (johncarlosbaez.wordpress.com)
 (TXT) w3m dump (johncarlosbaez.wordpress.com)
        
       | illuminant wrote:
       | Entropy is the distribution of potential over negative potential.
       | 
       | This could be said "the distribution of whatever may be over the
       | surface area of where it may be."
       | 
       | This is erroneously taught in conventional information theory as
       | "the number of configurations in a system" or the available
       | information that has yet to be retrieved. Entropy includes the
       | unforeseen and the out-of-scope.
       | 
       | Entropy is merely the predisposition to flow from high to low
       | pressure (potential). That is it. Information is a form of
       | potential.
       | 
       | Philosophically what are entropy's guarantees?
       | 
       | - That there will always be a super-scope, which may interfere in
       | ways unanticipated;
       | 
       | - everything decays; the only mystery is when and how.
        
         | mwbajor wrote:
         | All definitions of entropy stem from one central, universal
         | definition: entropy is the amount of energy unavailable for
         | useful work. Or, put more carefully: entropy describes the fact
         | that not all of the energy consumed can be used for work.
        
           | ajkjk wrote:
           | There's a good case to be made that the information-theoretic
           | definition of entropy is the most fundamental one, and the
           | version that shows up in physics is just that concept as
           | applied to physics.
        
             | rimunroe wrote:
             | My favorite course I took as part of my physics degree was
             | statistical mechanics. It leaned way closer to information
             | theory than I would have expected going in, but in
             | retrospect it should have been obvious.
             | 
             | Unrelated: my favorite bit from any physics book is
             | probably still the introduction of the first chapter of
             | "States of Matter" by David Goodstein: "Ludwig Boltzmann,
             | who spent much of his life studying statistical mechanics,
             | died in 1906, by his own hand. Paul Ehrenfest, carrying on
             | the work, died similarly in 1933. Now it is our turn to
             | study statistical mechanics."
        
             | galaxyLogic wrote:
             | That would mean that information theory is not part of
             | physics, right? So Information Theory and Entropy are
             | part of metaphysics?
        
               | ajkjk wrote:
               | Well it's part of math, which physics is already based
               | on.
               | 
               | Whereas metaphysics is, imo, "stuff that's made up and
               | doesn't matter". Probably not the most standard take.
        
             | imtringued wrote:
             | Yeah, people seemingly misunderstand that the entropy
             | applied to thermodynamics is simply an aggregate statistic
             | that summarizes the complex state of the thermodynamic
             | system as a single real number.
             | 
             | The fact that entropy always rises, etc., has nothing to do
             | with the statistical concept of entropy itself. It is simply
             | an easier way to express the physics concept that
             | individual atoms spread out their kinetic energy across a
             | large volume.
        
           | ziofill wrote:
           | I think what you describe is the application of entropy in
           | the thermodynamic setting, which doesn't apply to "all
           | definitions".
        
           | mitthrowaway2 wrote:
           | This definition is far from universal.
        
         | ziofill wrote:
         | > Entropy includes the unforeseen and the out-of-scope.
         | 
         | Mmh, no it doesn't. You need to define your state space,
         | otherwise it's an undefined quantity.
        
           | kevindamm wrote:
           | But it is possible to account for the unforeseen (or out-of-
           | vocabulary) by, for example, a Good-Turing estimate. This
           | satisfies your demand for a fully defined state space while
           | also being consistent with GP's definition.
        
           | illuminant wrote:
           | You are referring to the conceptual device you believe belongs
           | to you and your equations. Entropy creates attraction and
           | repulsion, even causing working bias. We rely upon it for our
           | system functions.
           | 
           | The undefined is uncertain, and uncertainty is entropic.
        
             | fermisea wrote:
             | Entropy is a measure; it doesn't create anything. This is
             | highly misleading.
        
         | axblount wrote:
         | Baez seems to use the definition you call erroneous: "It's easy
         | to wax poetic about entropy, but what is it? I claim it's the
         | amount of information we don't know about a situation, which in
         | principle we could learn."
        
         | eoverride wrote:
         | This answer is as confident as it is wrong, and full of
         | gibberish.
         | 
         | Entropy is not a "distribution", it's a functional that maps a
         | probability distribution to a scalar value, i.e. a single
         | number.
         | 
         | It's the negative mean log-probability of a distribution.
         | 
         | It's an elementary statistical concept, independent of physical
         | concepts like "pressure", "potential", and so on.
        
           | illuminant wrote:
           | It sounds like log-probability is the manifold surface area.
           | 
           | Distribution of potential over negative potential. Negative
           | potential is the "surface area", and available potential
           | distributes itself "geometrically". All this is iterative,
           | obviously, with some periodicity set by the universal speed
           | limit.
           | 
           | It really doesn't sound like you disagree with me.
        
       | Jun8 wrote:
       | A well known anecdote reported by Shannon:
       | 
       | "My greatest concern was what to call it. I thought of calling it
       | 'information,' but the word was overly used, so I decided to call
       | it 'uncertainty.' When I discussed it with John von Neumann, he
       | had a better idea. Von Neumann told me, 'You should call it
       | entropy, for two reasons. In the first place your uncertainty
       | function has been used in statistical mechanics under that name,
       | so it already has a name. In the second place, and more
       | important, no one really knows what entropy really is, so in a
       | debate you will always have the advantage.'"
       | 
       | See the answers to this MathOverflow SE question
       | (https://mathoverflow.net/questions/403036/john-von-neumanns-...)
       | for references on the discussion whether Shannon's entropy is the
       | same as the one from thermodynamics.
        
         | BigParm wrote:
         | Von Neumann was the king of kings
        
       | dekhn wrote:
       | I really liked the approach my stat mech teacher used. In nearly
       | all situations, entropy just ends up being the log of the number
       | of ways a system can be arranged
       | (https://en.wikipedia.org/wiki/Boltzmann%27s_entropy_formula)
       | although I found it easiest to think in terms of pairs of dice
       | rolls.
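       | 
       | A quick sketch of the dice picture in Python (my own toy code):
       | treat the sum of two dice as the "macrostate" and the ordered pair
       | as the "microstate"; Boltzmann's S = k log W then just counts
       | arrangements per macrostate (constant k dropped):
       | 
       |     import math
       |     from collections import Counter
       | 
       |     # W = number of ordered pairs (microstates) giving each sum
       |     ways = Counter(a + b for a in range(1, 7) for b in range(1, 7))
       |     for total in sorted(ways):
       |         W = ways[total]
       |         print(f"sum={total:2d}  W={W}  S=log(W)={math.log(W):.3f}")
       |     # sum=7 has the most microstates (W=6), hence the most entropy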
        
         | petsfed wrote:
         | And this is what I prefer too, although with the clarification
         | that it's the number of ways that a system can be arranged
         | _without changing its macroscopic properties_.
         | 
         | It's, unfortunately, not very compatible with Shannon's usage in
         | any but the shallowest sense, which is why it stays firmly in
         | the land of physics.
        
         | abetusk wrote:
         | Also known as "the number of bits to describe a system". For
         | example, with 2^N equally probable states, you need N bits to
         | describe each state.
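         | 
         | A quick sanity check of that (my arithmetic, base-2 logs): for a
         | uniform distribution over 2^N states,
         | 
         |     S = - sum_i (1/2^N) log2(1/2^N) = log2(2^N) = N bits.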
        
       | Tomte wrote:
       | PBS Spacetime's entropy playlist:
       | https://youtube.com/playlist?list=PLsPUh22kYmNCzNFNDwxIug8q1...
        
         | foobarian wrote:
         | A bit off-color but classic:
         | https://www.youtube.com/watch?v=wgltMtf1JhY
        
       | drojas wrote:
       | My definition: Entropy is a measure of the accumulation of non-
       | reversible energy transfers.
       | 
       | Side note: All reversible energy transfers involve an increase in
       | potential energy. All non-reversible energy transfers involve a
       | decrease in potential energy.
        
         | snarkconjecture wrote:
         | That definition doesn't work well because you can have changes
         | in entropy even if no energy is transferred, e.g. by exchanging
         | some other conserved quantity.
         | 
         | The side note is wrong in letter and spirit; turning potential
         | energy into heat is one way for something to be irreversible,
         | but neither of those statements is true.
         | 
         | For example, consider an iron ball being thrown sideways. It
         | hits a pile of sand and stops. The iron ball is not affected
         | structurally, but its kinetic energy is transferred (almost
         | entirely) to heat energy. If the ball is thrown slightly
         | upwards, potential energy increases but the process is still
         | irreversible.
         | 
         | Also, the changes of potential energy in corresponding parts of
         | two Carnot cycles are directionally the same, even if one is
         | ideal (reversible) and one is not (irreversible).
        
       | ooterness wrote:
       | For information theory, I've always thought of entropy as
       | follows:
       | 
       | "If you had a really smart compression algorithm, how many bits
       | would it take to accurately represent this file?"
       | 
       | i.e., Highly repetitive inputs compress well because they don't
       | have much entropy per bit. Modern compression algorithms are good
       | enough on most data to be used as a reasonable approximation for
       | the true entropy.
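       | 
       | A rough sketch of the idea in Python (my code, with zlib standing
       | in for the "really smart" compressor): repetitive data compresses
       | far below its raw size, while random data barely compresses at all:
       | 
       |     import os, zlib
       | 
       |     repetitive = b"abcabcabc" * 1000   # highly structured input
       |     random_ish = os.urandom(9000)      # ~8 bits of entropy per byte
       | 
       |     for name, data in [("repetitive", repetitive),
       |                        ("random", random_ish)]:
       |         out = zlib.compress(data, 9)
       |         print(f"{name}: {len(data)} -> {len(out)} bytes "
       |               f"(~{8 * len(out) / len(data):.2f} bits/byte)")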
        
       | glial wrote:
       | I felt like I finally understood Shannon entropy when I realized
       | that it's a subjective quantity -- a property of the observer,
       | not the observed.
       | 
       | The entropy of a variable X is the amount of information required
       | to drive the observer's uncertainty about the value of X to zero.
       | As a correlate, your uncertainty and mine about the value of the
       | same variable X could be different. This is trivially true, as we
       | could each have received different information about X. H(X)
       | should be H_{observer}(X), or even better, H_{observer, time}(X).
       | 
       | As clear as Shannon's work is in other respects, he glosses over
       | this.
        
         | JumpCrisscross wrote:
         | > _it's a subjective quantity -- a property of the observer,
         | not the observed_
         | 
         | Shannon's entropy is a property of the source-channel-receiver
         | system.
        
           | glial wrote:
           | Can you explain this in more detail?
           | 
           | Entropy is calculated as a function of a probability
           | distribution over possible messages or symbols. The sender
           | might have a distribution P over possible symbols, and the
           | receiver might have another distribution Q over possible
           | symbols. Then the "true" distribution over possible symbols
           | might be another distribution yet, call it R. The mismatch
           | between these is what leads to various inefficiencies in
           | coding, decoding, etc [1]. But both P and Q are beliefs about
           | R -- that is, they are properties of observers.
           | 
           | [1] https://en.wikipedia.org/wiki/Kullback-
           | Leibler_divergence#Co...
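           | 
           | A quick numerical sketch of the mismatch (my own toy numbers):
           | coding against the wrong belief Q about the "true" distribution
           | R costs the cross-entropy H(R, Q) bits per symbol instead of
           | H(R), and the overhead is exactly KL(R || Q):
           | 
           |     import math
           | 
           |     R = [0.5, 0.25, 0.25]   # "true" symbol frequencies
           |     Q = [0.8, 0.1, 0.1]     # receiver's mistaken belief
           | 
           |     H_R  = -sum(r * math.log2(r) for r in R)
           |     H_RQ = -sum(r * math.log2(q) for r, q in zip(R, Q))
           |     KL   = sum(r * math.log2(r / q) for r, q in zip(R, Q))
           | 
           |     print(H_R, H_RQ, H_RQ - H_R, KL)   # overhead == KL(R || Q)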
        
         | rachofsunshine wrote:
         | This doesn't really make entropy itself observer dependent.
         | (Shannon) entropy is a property of a distribution. It's just
         | that when you're measuring different observers' beliefs, you're
         | looking at different distributions (which can have different
         | entropies the same way they can have different means,
         | variances, etc).
        
           | mitthrowaway2 wrote:
           | Entropy is a property of a distribution, but since math does
           | sometimes get applied, we also attach distributions to
           | _things_ (eg. the entropy of a random number generator, the
           | entropy of a gas...). Then when we talk about the entropy of
           | those things, those entropies are indeed subjective, because
           | different subjects will attach different probability
           | distributions to that system depending on their information
           | about that system.
        
             | stergios wrote:
             | "Entropy is a property of matter that measures the degree
             | of randomization or disorder at the microscopic level", at
             | least when considering the second law.
        
               | mitthrowaway2 wrote:
               | Right, but the very interesting thing is it turns out
               | that what's random to me might not be random to you! And
               | the reason that "microscopic" is included is because
               | that's a shorthand for "information you probably don't
               | have about a system, because your eyes aren't that good,
               | or even if they are, your brain ignored the fine details
               | anyway."
        
           | davidmnoll wrote:
           | Right but in chemistry class the way it's taught via Gibbs
           | free energy etc. makes it seem as if it's an intrinsic
           | property.
        
         | dist-epoch wrote:
         | Trivial example: if you know the seed of a pseudo-random number
         | generator, a sequence generated by it has very low entropy.
         | 
         | But if you don't know the seed, the entropy is very high.
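         | 
         | A toy sketch of that in Python (my code): the first byte out of a
         | seeded PRNG is a deterministic function of the seed, so given the
         | seed its entropy is 0, while over an unknown 8-bit seed it
         | carries up to 8 bits:
         | 
         |     import math, random
         |     from collections import Counter
         | 
         |     outputs = [random.Random(s).getrandbits(8) for s in range(256)]
         |     counts = Counter(outputs)
         |     n = len(outputs)
         |     H = -sum((c / n) * math.log2(c / n) for c in counts.values())
         |     print(f"~{H:.2f} bits with unknown seed; 0 bits if seed known")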
        
           | rustcleaner wrote:
           | Theoretically, it's still only the entropy of the seed-space
           | + time-space it could have been running in, right?
        
         | sva_ wrote:
         | https://archive.is/9vnVq
        
       | niemandhier wrote:
       | My go-to source for understanding entropy: http://philsci-
       | archive.pitt.edu/8592/1/EntropyPaperFinal.pdf
        
       | prof-dr-ir wrote:
       | If I were to write a book with that title, I would get to the
       | point a bit faster, probably as follows.
       | 
       | Entropy is _just_ a number you can associate with a probability
       | distribution. If the distribution is discrete, so you have a set
       | p_i, i = 1..n, which are each positive and sum to 1, then the
       | definition is:
       | 
       | S = - sum_i p_i log( p_i )
       | 
       | Mathematically we say that entropy is a real-valued function on
       | the space of probability distributions. (Elementary exercises:
       | show that S >= 0 and it is maximized on the uniform
       | distribution.)
       | 
       | That is it. I think there is little need for all the mystery.
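       | 
       | For what it's worth, here is that definition and the two exercises
       | checked numerically in Python (my own sketch, natural logs):
       | 
       |     import math, random
       | 
       |     def S(p):
       |         """Entropy of a discrete distribution p (sums to 1)."""
       |         return -sum(pi * math.log(pi) for pi in p if pi > 0)
       | 
       |     n = 4
       |     uniform = [1 / n] * n
       |     for _ in range(5):
       |         w = [random.random() for _ in range(n)]
       |         p = [x / sum(w) for x in w]
       |         assert 0 <= S(p) <= S(uniform) + 1e-12
       |     print(S(uniform), math.log(n))   # maximum is log(n)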
        
         | kgwgk wrote:
         | That covers one and a half of the twelve points he discusses.
        
           | prof-dr-ir wrote:
           | Correct! And it took me just one paragraph, not the 18 pages
           | of meandering (and I think confusing) text that it takes the
           | author of the pdf to introduce the same idea.
        
             | kgwgk wrote:
             | You didn't introduce any idea. You said it's "just a
             | number" and wrote down a formula without any explanation or
             | justification.
             | 
             | I concede that it was much shorter though. Well done!
        
               | bdjsiqoocwk wrote:
               | Haha you reminded me of that idea in software engineering
               | that "it's easy to make an algorithm faster if you accept
               | that at times it might output the wrong result; in fact
                | you can make it infinitely fast"
        
         | rachofsunshine wrote:
         | The problem is that this doesn't get at many of the intuitive
         | properties of entropy.
         | 
         | A different explanation (based on macro- and micro-states)
         | makes it intuitively obvious why entropy is non-decreasing with
         | time or, with a little more depth, what entropy has to do with
         | temperature.
        
           | prof-dr-ir wrote:
           | The above evidently only suffices as a definition, not as an
           | entire course. My point was just that I don't think any other
           | introduction beats this one, especially for a book with the
           | given title.
           | 
           | In particular it has always been my starting point whenever I
           | introduce (the entropy of) macro- and micro-states in my
           | statistical physics course.
        
         | nabla9 wrote:
         | Everyone who sees that formula can immediately see that it
         | leads to the principle of maximum entropy.
         | 
         | Just like everyone seeing Maxwell's equations can immediately
         | see that you can derive the speed of light classically.
         | 
         | Oh dear. The joy of explaining the little you know.
        
           | prof-dr-ir wrote:
           | As of this moment there are six other top-level comments
           | which each try to define entropy, and frankly they are all
           | wrong, circular, or incomplete. Clearly the very _definition_
           | of entropy is confusing, and the _definition_ is what my
           | comment provides.
           | 
           | I never said that all the other properties of entropy are now
           | immediately visible. Instead I think it is the only universal
           | starting point of any reasonable discussion or course on the
           | subject.
           | 
           | And lastly I am frankly getting discouraged by all the
           | dismissive responses. So this will be my last comment for the
           | day, and I will leave you in the careful hands of, say, the
           | six other people who are obviously so extremely knowledgeable
           | about this topic. /s
        
         | mitthrowaway2 wrote:
         | So the only thing you need to know about entropy is that it's
         | _a real-valued number you can associate with a probability
         | distribution_? And that's it? I disagree. There are several
         | numbers that can be associated with a probability distribution,
         | and entropy is an especially useful one, but to understand why
         | entropy is useful, or why you'd use that function instead of a
         | different one, you'd need to know a few more things than just
         | what you've written here.
        
         | bdjsiqoocwk wrote:
         | > That is it. I think there is little need for all the mystery
         | 
         | You know so little you're not even aware how little you know.
        
         | senderista wrote:
         | Many students will want to know where the minus sign comes
         | from. I like to write the formula instead as S = sum_i p_i log(
         | 1 / p_i ), where (1 / p_i) is the "surprise" (i.e., the expected
         | number of trials until the first success) associated with a given
         | outcome (or symbol), and we average it over all outcomes (i.e.,
         | weight it by the probability of the outcome). We take the log
         | of the "surprise" because entropy is an extensive quantity, so
         | we want it to be additive.
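         | 
         | A quick numerical check of the additivity point (my own sketch):
         | for independent X and Y the entropy of the pair is the sum of the
         | entropies, precisely because the log turns a product of
         | probabilities into a sum:
         | 
         |     import math
         | 
         |     def H(p):
         |         return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)
         | 
         |     X = [0.5, 0.5]           # fair coin
         |     Y = [0.9, 0.05, 0.05]    # biased three-sided die
         |     joint = [px * py for px in X for py in Y]   # p(x,y)=p(x)p(y)
         | 
         |     print(H(X) + H(Y), H(joint))   # the two numbers agree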
        
       | eointierney wrote:
       | Ah JCB, how I love your writing, you are always so very generous.
       | 
       | Your This Week's Finds were a hugely enjoyable part of my
       | undergraduate education and beyond.
       | 
       | Thank you again.
        
       | dmn322 wrote:
       | This seems like a great resource for referencing the various
       | definitions. I've tried my hand at developing an intuitive
       | understanding: https://spacechimplives.substack.com/p/observers-
       | and-entropy. TLDR - it's an artifact of the model we're using. In
       | the thermodynamic definition, the energy accounted for in the
       | terms of our model is information. The energy that's not is
       | entropic energy. That is why it's not "usable" energy and the
       | process isn't reversible.
        
       | zoenolan wrote:
       | Hawking on the subject
       | 
       | https://youtu.be/wgltMtf1JhY
        
       | bdjsiqoocwk wrote:
       | Hmmm, I've noticed that the list of things that contribute to
       | entropy omits particles which under "normal circumstances" on
       | earth exist in bound states; for example, it doesn't mention W
       | bosons or gluons. But in some parts of the universe they're not
       | bound but in a different state of matter, e.g. quark-gluon plasma.
       | I wonder how, or if, this was taken into account.
        
       ___________________________________________________________________
       (page generated 2024-07-22 23:02 UTC)