[HN Gopher] Probability (1963)
       ___________________________________________________________________
        
       Probability (1963)
        
       Author : rcshubhadeep
       Score  : 152 points
       Date   : 2022-10-10 08:46 UTC (14 hours ago)
        
 (HTM) web link (www.feynmanlectures.caltech.edu)
 (TXT) w3m dump (www.feynmanlectures.caltech.edu)
        
       | sillysaurusx wrote:
       | It's worth noting (much to my surprise) that the Feynman lectures
       | on physics weren't written entirely by Feynman. I've often
       | wondered whether the wonderful conversational style I've
       | associated with him is actually him, or one of his
       | contemporaries.
       | 
       | Either way, this is a great chapter on probability. Thanks to
       | whoever wrote it!
        
         | mananaysiempre wrote:
         | There's an unpleasant point in the history of the Feynman
          | lectures: their coming out as "The Feynman Lectures on Physics
          | by Feynman, Leighton, and Sands" is a compromise solution to a
          | dispute in which Leighton and Sands wanted to be credited for
         | editing the transcript into readable prose and Feynman
         | considered that work to be purely mechanical and not worth any
         | credit at all[1]. (There is apparently more uncredited work in
         | there as well[2].) "Feynman didn't understand editing is an
         | art" is not much of a headline, but the compromise is still
         | there, in huge letters right on the cover.
         | 
         | [1] https://doi.org/10.1063/1.1955479
         | 
         | [2] https://doi.org/10.1063/PT.6.3.20211209a
        
         | barrenko wrote:
         | Every great man is a Shakespeare.
        
           | sillysaurusx wrote:
           | I'll look terribly uncultured for asking this, but was
           | Shakespeare a collection of individuals? That's very
           | interesting; thanks.
           | 
           | EDIT: Aha: https://en.m.wikipedia.org/wiki/Shakespeare_author
           | ship_quest...
           | 
           | Well, this is a fascinating rabbit hole. Apparently there's
           | some question whether Shakespeare himself was literate, since
           | his parents and daughters seemingly weren't.
        
             | marcosdumay wrote:
              | I guess the GP is more about the idea that Shakespeare was
              | just the first one to write down all the folkloric ideas of
              | his time in a format that people loved, rather than an
              | unexplainable genius who created all those interesting
              | stories. (Kinda like Disney, but we don't have the
              | originals anymore.)
             | 
             | That one is a lot better accepted than the idea that he
             | didn't write his works.
        
             | bbarnett wrote:
             | Shakespeare true or not, I have often found that when one
             | excels, others sit on the sidelines, stupefied in
             | disbelief, then shout cries of delusion about the accolades
             | before them.
             | 
             | Hence, Shakespeare cannot be real, for he excels you see...
        
             | ageitgey wrote:
              | Just keep in mind that the Shakespeare authorship
              | conspiracy theory is the "the moon landing was fake" of the
              | 1800s. The theory first gained popularity thanks to Delia
              | Bacon in the 1850s, over 200 years after Shakespeare lived.
             | 
              | There's no evidence Shakespeare didn't write his plays and
              | a lot of evidence he did (including multiple books and
              | writings published during his life listing him as author
              | or referring to him as an author).
        
         | darkerside wrote:
         | Have you watched any of his videos? I don't know the exact
         | style you're thinking of, but wonderful and conversational
         | match what I've seen from him.
        
         | stiff wrote:
         | The "Feynman lectures on physics" books are based on actual
         | lectures to Caltech students. You can listen to audio tapes of
         | the lectures here:
         | 
         | https://www.feynmanlectures.caltech.edu/flptapes.html
        
           | sillysaurusx wrote:
           | Thanks! Interestingly, the tape for Probability seems to be
           | narrated by someone other than Feynman.
           | 
           | I should probably give some evidence to back up my claim that
           | Feynman didn't write all of the Lectures, but alas, it's
           | late. I think the credits for the rest of the authors were in
           | the preface, or at the end. I just wish they'd gotten a
           | little more credit.
        
             | stiff wrote:
             | I think these people other than Feynman transcribed and
             | edited the lectures into book form. This seems to have been
             | the process with most (all?) books of his, "QED" and "The
             | character of physical law" were also delivered as lectures
             | and even "Surely You're Joking, Mr. Feynman!" was an
             | interview originally that was later transcribed and edited.
        
             | vmilner wrote:
             | That someone is Matthew Sands:
             | 
             | "Early on, though, a small problem surfaced. Feynman had a
             | long-time commitment to be absent from Caltech the third
             | week of the fall semester, and so would miss two class
             | lectures. The problem was easily solved: I would substitute
             | for him on those days. However, to avoid breaking the
             | continuity of his presentation, I would give the two
             | lectures on subsidiary topics that, although useful to the
             | students, would not be related to his main line of
             | development."
             | 
             | https://physicstoday.scitation.org/doi/10.1063/1.1955479
             | 
             | Later on he writes about the published books:
             | 
             | "The next stumbling block was more serious: choosing a
             | title for the book. Visiting Feynman in his office one day
             | to discuss the subject, I proposed that we adopt a simple
             | name like "Physics" or "Physics One" and suggested that the
             | authors be Feynman, Leighton, and Sands. He didn't
             | particularly like the suggested title and had a rather
             | violent reaction to the proposed authors: "Why should your
             | names be there? You were only doing the work of a
             | stenographer!" I disagreed and pointed out that, without
             | the efforts of Leighton and me, the lectures would never
             | have come to be a book. The disagreement was not
             | immediately resolved. I returned to the discussion some
             | days later and we came up with a compromise: "The Feynman
             | Lectures on Physics by Feynman, Leighton, and Sands."
        
               | triknomeister wrote:
                | That's a horrible move from Feynman regarding credit.
        
               | lupire wrote:
               | It was also a "horrible" move for the two
               | editor/publishers to claim equal author credit for
               | Feynman's much more extensive creative effort.
               | 
               | In reality, a simple negotiation led to a good decision
               | that made everyone happy.
        
               | vmilner wrote:
               | [It's worth emphasising that Sands was hugely positive
               | about the lectures and was great friends with Feynman.]
        
       | wodenokoto wrote:
        | If you scroll to the top there's a camera icon. If you click it,
        | you can view photos from the actual lecture.
        
       | graycat wrote:
       | Sorry, Feynman, whatever else he did, on probability he gets a
       | grade of flat F. Here is why: In his book _Lectures on Physics_
       | he states that a particle of unknown location has a probability
        | density uniform over all of space. No it doesn't. No such
       | density can exist. Done. Grade flat F.
       | 
        | I tried to be a physics major but could not swallow all the
        | daily, really stupid mistakes such as this one by Feynman that I
        | got in each physics lecture, and I didn't have time both to learn
        | the physics AND to clean up the sloppy math. So, I majored in
        | math.
       | 
       | As I learned the math, from some of the best sources, I came to
       | understand just how just plain awful the math of the physics
       | community is.
       | 
       | Then in one of the Adams lectures on quantum mechanics at MIT I
       | saw some of the reason: The physics community takes pride in
       | doing junk math. They get by with it because they won't take the
       | math seriously anyway, that is, they insist on experimental
       | evidence. So, to them, the math can be just a heuristic, a hint
       | for some guessing.
       | 
       | Students need to be told this, in clear terms, early on.
       | 
       | It went on this way: In one of the lectures from MIT a statement
       | was that the wave functions were differentiable and also
       | continuous. Of COURSE they are continuous -- every differentiable
       | function is continuous.
       | 
       | The lectures made a total mess out of correlation and
       | independence. It looks like Adams does not understand the two or
       | their difference clearly.
       | 
       | There was more really sloppy stuff around Fourier theory. I got
       | my Fourier theory from three of W. Rudin's books. It looks like
       | at MIT they get Fourier theory from a comic book.
       | 
       | I got sick, really sick, of the math in physics. Feynman on
       | probability is just one example.
        
         | jbay808 wrote:
         | > a particle of unknown location has a probability density
         | uniform over all of space. No it doesn't.
         | 
         | In that case, in which locations is the density higher and in
         | which is it lower?
        
         | kgwgk wrote:
         | > he states that a particle of unknown location has a
         | probability density uniform over all of space. No it doesn't.
         | No such density can exist.
         | 
          | No such particle can exist then.
         | 
         | You're assuming infinite space though. Did he?
        
           | graycat wrote:
           | Of course, the _space_ was not made explicit. With R the set
           | of real numbers, the usual assumption in the physics is that
           | the math is done in R^3 with the usual inner product, norm,
           | metric, and topology.
        
             | kgwgk wrote:
             | The usual assumption in physics is that the observable
             | universe is bounded.
             | 
             | https://www.feynmanlectures.caltech.edu/I_05.html#Ch5-S6
             | 
             | Edit: you may be talking about the paragraph "For an atom
             | at rest," in
             | https://www.feynmanlectures.caltech.edu/III_07.html#Ch7-S1
             | 
             | The section starts saying that "We want now to talk a
             | little bit about the behavior of probability amplitudes in
             | time. [...] We are always in the difficulty that we can
             | either treat something in a logically rigorous but quite
             | abstract way, or we can do something which is not at all
             | rigorous but which gives us some idea of a real situation--
             | postponing until later a more careful treatment. With
             | regard to energy dependence, we are going to take the
             | second course. We will make a number of statements. We will
             | not try to be rigorous--but will just be telling you things
             | that have been found out, to give you some feeling for the
             | behavior of amplitudes as a function of time. "
        
               | graycat wrote:
               | I put in a qualification:
               | 
               | > ... the usual assumption in the physics is that the
               | math ...
               | 
               | Soooo, you got into the physics. I avoided getting into
               | the physics. E.g., when Newton wrote out his second law
                | or Maxwell wrote out his equations, it was just _math_,
                | and the assumption was, as I mentioned, R^3. Stokes'
                | theorem and the Navier-Stokes equations were implicitly in
                | R^3.
        
               | kgwgk wrote:
               | > Soooo, you got into the physics. I avoided getting into
               | the physics.
               | 
               | What did you think that Feynman's Lectures on Physics
               | were about?
        
         | Gatsky wrote:
         | You rage-quit physics, essentially.
        
         | mturmon wrote:
         | It's tough.
         | 
         | I used to work in an area of applied probability where some
         | statistical-mechanics principles were applicable. I'd read
         | papers where authors were making analogies of a large neural
         | network to a stat-mech system, using an applicable stat-mech
         | approximation, and then differentiating that approximation to
         | get a probability bound.
         | 
         | It gave interesting results, and did show you something about
         | the original problem that was hard to get by sticking to the
         | original formalism. But at the end of the day, you really would
         | not bet the farm on the truth of those approximations...
         | 
          | On the other hand, Fourier analysis was originally doubted and
          | scorned by mathematicians, but (if I'm remembering the story
         | correctly) ended up being used so much that theory was
         | developed to explain in what sense the Fourier transform
         | approximates the original function.
         | 
         | Another example of the interplay between physics and
         | mathematics is the percolation problem, where there was a kind
         | of archipelago of physics-motivated results that probabilists
         | have been trying to tidy up for decades now. E.g., sec. 1.2 of:
         | https://www.unige.ch/~duminil/publi/2018ICM.pdf
        
         | q-big wrote:
         | > Sorry, Feynman, whatever else he did, on probability he gets
         | a grade of flat F. Here is why: In his book Lectures on Physics
         | he states that a particle of unknown location has a probability
         | density uniform over all of space. No it doesn't. No such
         | density can exist. Done. Grade flat F.
         | 
          | I would rather consider that, since it seems that you "need" a
          | uniform distribution for a particle of unknown location, it
          | might make sense for such applications from physics to weaken
          | the requirement that a probability measure be sigma-additive to
          | the requirement that it merely be finitely additive. Then it
          | should be possible to define such a "uniform probability
          | 'measure' over all space", perhaps similarly to the example
          | given at
         | 
         | > https://en.wikipedia.org/w/index.php?title=Sigma-
         | additive_se...
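          | 
          | (For what it's worth, one standard construction in that spirit
          | is "natural density", which is finitely additive but not
          | sigma-additive. A minimal Python sketch of the idea -- my own
          | illustration, and the helper empirical_density is hypothetical,
          | not from the comment above:)
          | 
          |     # Natural density: a finitely additive stand-in for a
          |     # "uniform" measure on the positive integers.
          |     # density(S) = limit as N -> inf of |S intersect {1..N}| / N
          |     def empirical_density(predicate, N):
          |         return sum(1 for n in range(1, N + 1) if predicate(n)) / N
          | 
          |     for N in (10**3, 10**5, 10**6):
          |         # the fraction of even numbers approaches 1/2
          |         print(N, empirical_density(lambda n: n % 2 == 0, N))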
        
         | hither_shores wrote:
         | > really stupid mistakes such as this one by Feynman
         | 
         | It's not a mistake, it's a "lie-to-children" fundamentally no
         | different from an intro analysis class talking about "the" real
         | numbers. Freshmen aren't ready for model theory, and they're
         | not ready for rigged Hilbert spaces.
        
         | dchftcs wrote:
         | >probability density uniform over all of space
         | 
         | If space is bounded then such a density can exist.
        
         | clircle wrote:
         | What is the problem with density on space?
        
           | graycat wrote:
           | Take one cubic inch. Let the probability the particle is in
           | that cubic inch be p. Then take integer
           | 
           | n >= 2 / p
           | 
            | cubic inches. Then the probability that the particle is in
            | those n disjoint cubic inches is
           | 
           | np >= 2 > 1
           | 
           | greater than 1, a contradiction. Done.
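            | 
            | (A minimal numeric restatement of the same counting argument,
            | just as an illustration -- the value of c is hypothetical:)
            | 
            |     import math
            | 
            |     # Suppose a uniform density assigns probability c > 0 to
            |     # every disjoint unit cube. Take enough cubes and the
            |     # total "probability" exceeds 1, which is impossible.
            |     c = 1e-9
            |     n = math.ceil(2 / c)
            |     print(n * c)   # >= 2 > 1, a contradiction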
           | 
           | One well considered and informed explanation is that the
           | physics community abuses its students.
        
             | clircle wrote:
             | There's no uniform distribution on an infinite space...
        
             | hackandthink wrote:
              | Probability theory without measure theory is advancing.
              | Interestingly, Tobias Fritz is a physicist.
             | 
             | "A synthetic approach to Markov kernels..."
             | 
             | https://arxiv.org/abs/1908.07021
        
               | dchftcs wrote:
                | graycat's pedantic approach to mathematical formalism is
                | quite defeatist in that it basically disallows any
                | mathematical concept from advancing beyond precise but
                | limited language towards the edge of imagination. The kind
                | of attitude that forbade sqrt(2) from existing in the
                | Pythagorean days.
               | 
               | A probability theory that accommodates the concept of
               | "picking a random even number from all integers" can be
               | valuable, isn't compatible with measure theory but is
               | easy to grasp intuitively. When the mathematical tools
               | aren't good enough you still want to be able to reason
               | about concepts, which is why the tools are developed to
               | begin with. Fortunately almost all things are
               | conceptually tractable when dealing with finite space or
               | quantities; if a statement can be transformed to
               | something rigorous (if numerically imprecise) by forcing
               | boundedness it's not too terrible to speak in unbounded
               | terms when the physical world is what you want to model.
               | I can understand why physicists don't want to be burdened
               | with too much about rigor - they can afford the small
               | risk they're wrong sometimes, but can't afford to slow
               | down their search for new discoveries, when so many
               | questions remain unanswered.
        
             | vitus wrote:
             | One counterargument is that space is finite, and so your
             | choice of n is greater than the volume of the universe.
             | (And so sigma-additivity doesn't apply, since your choices
             | of cubic inches are not disjoint.)
             | 
              | But sure, if you're assuming an unbounded space of infinite
              | volume, a uniform probability density across that space
              | must be identically zero everywhere.
        
               | graycat wrote:
               | > identically zero everywhere.
               | 
               | Not a probability density.
        
       | sylware wrote:
       | What's astonishing: many laws of physics emerge from statistical
       | approximations of quantum mechanics.
       | 
        | One day, if I really get into quantum mechanics, I will try to
        | understand how they rebuilt Maxwell's equations from QED.
        
         | ThomPete wrote:
         | Not if you submit to the Everett interpretation.
        
         | marginalia_nu wrote:
         | Classical behavior emerges from quantum mechanics as you enter
         | the classical domain.
         | 
         | This can be explained through phase decoherence. As temperature
         | rises, random phase shifts are introduced, which effectively
         | removes the quantum effect. You can show mathematically how
         | this works.
         | 
          | Consider the Young experiment:
         | 
         | https://imgur.com/FqogDJj
         | 
          | For a plane wave ps ~ e^(ipx/h - iot), the wave function at X
          | is the sum of two components
          | 
          |     <X|ps> = <X|P> + <X|Q>
          | 
          | where, for some path-independent normalization function ps(X,t),
          | and using the small angle assumption (QX - PX = 2Xa/L), the
          | components are
          | 
          |     <X|P> = ps(X,t) * (1/2) * e^(+ipXa/hL)
          |     <X|Q> = ps(X,t) * (1/2) * e^(-ipXa/hL)
          | 
          | and the probability of finding the particle at X is
          | 
          |     |<X|ps>|^2 = |ps(X,t)|^2 * cos^2(pXa/hL)
          | 
          | That is what you'd expect from the Young experiment. If we
          | introduce a constant phase shift ph between P and Q, you get
          | this instead:
          | 
          |     |<X|ps>|^2 = |ps(X,t)|^2 * cos^2(pXa/hL + ph)
          | 
          | If this phase shift is instead random, the formula becomes
          | 
          |     |<X|ps>|^2 = (1/2) * (1 + Integral dph P(ph) cos(2 pXa/hL + 2 ph)) * |ps(X,t)|^2
          | 
          | where P(ph) is a probability function for the phase shift. If
          | the probability function is flat, the integral is zero, since
          | you're integrating the cosine over its full period. What you
          | get is the classical result:
          | 
          |     |<X|ps>|^2 = |ps(X,t)|^2
          | 
          | You can even re-phrase random phase shifts as a diffusion
          | equation and find, with a as the diffusion coefficient,
          | 
          |     |<X|ps>|^2 = |ps(X,t)|^2 * (1/2) * (1 + e^(-at) * cos^2(pXa/hL + ph))
          | 
          | i.e. the transition behavior from quantum to classical depends
          | directly on a measure of the decoherence:
          | 
          | a small => quantum result; a large => classical result.
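          | 
          | (If it helps, here is a minimal numerical sketch of that last
          | point -- an illustration only, not part of the derivation above:
          | averaging the fringe term cos^2(kx + ph) over increasingly
          | broad random phases washes the interference out to a flat 1/2.)
          | 
          |     import numpy as np
          | 
          |     x = np.linspace(-5, 5, 11)
          |     k = 2.0
          |     for spread in (0.0, 0.5, np.pi):   # width of the random phase
          |         phases = np.random.uniform(-spread, spread, size=(20000, 1))
          |         intensity = np.mean(np.cos(k * x + phases) ** 2, axis=0)
          |         print(spread, intensity.min(), intensity.max())
          |     # spread ~ 0  -> strong fringes (min near 0, max near 1)
          |     # spread ~ pi -> flat pattern, everything near 0.5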
        
           | UniverseHacker wrote:
           | How did you typeset all of that math into hn?
        
             | marginalia_nu wrote:
             | Painstakingly.
        
         | mhh__ wrote:
         | Read Quantum field theory for the gifted amateur
        
         | programmer_dude wrote:
         | I think you mean "ab initio".
        
           | killjoywashere wrote:
           | QED = Quantum electrodynamics, for which Feynman won the
           | Nobel Prize.
        
             | sylware wrote:
              | Yeah, I meant the latest QED based on the latest QFT.
        
             | programmer_dude wrote:
             | My bad.
        
         | mfn wrote:
         | Regarding how to derive Maxwell's equations from QED, I'd
         | recommend this lecture:
         | https://theoreticalminimum.com/courses/special-relativity-an...
         | 
         | This derivation is in the context of classical field theory,
         | but QED is only a short hop away through path integrals.
         | 
         | It's quite remarkable how the complexity of Maxwell's equations
         | can be reduced to a single term in the Lagrangian -
         | (F_uv)(F^uv), assuming no charges. That's really it!
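          | 
          | (For anyone curious, a compressed sketch of how that term gives
          | the source-free Maxwell equations -- standard classical field
          | theory, written in the same shorthand as above:)
          | 
          |     F_uv = d_u A_v - d_v A_u
          |     L    = -(1/4) (F_uv)(F^uv)
          | 
          |     Euler-Lagrange for the field A_v:
          |         d_u ( dL / d(d_u A_v) ) - dL / dA_v = 0
          |         =>  d_u F^uv = 0        (Gauss + Ampere, no sources)
          | 
          |     The remaining two equations (no monopoles, Faraday) are the
          |     identity d_l F_uv + d_u F_vl + d_v F_lu = 0, which holds
          |     automatically because F is built from A.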
        
       | hackandthink wrote:
        | Everybody knows Feynman; who knows Jaynes?
       | 
       | https://quantumfrontiers.com/2018/12/23/chasing-ed-jayness-g...
       | 
       | Jaynes about Probability in Science:
       | 
       | https://www.cambridge.org/gb/academic/subjects/physics/theor...
        
         | mananaysiempre wrote:
          | Jaynes is brilliant, but you ought to take care when reading
          | him. For example, AFAICT his digression on Gödel's theorem in
          | _The Logic of Science_ is complete nonsense, and the rant
          | against Kolmogorov-style axiomatization of infinite probability
          | spaces in the same book isn't much better.
        
           | kgwgk wrote:
           | "From many years of experience with its applications in
           | hundreds of real problems, our views on the foundations of
           | probability theory have evolved into something quite complex,
           | which cannot be described in any such simplistic terms as
           | 'pro-this' or 'anti-that'. For example, our system of
           | probability could hardly be more different from that of
           | Kolmogorov, in style, philosophy, and purpose. What we
           | consider to be fully half of probability theory as it is
           | needed in current applications - the principles for assigning
           | probabilities by logical analysis of incomplete information -
           | is not present at all in the Kolmogorov system.
           | 
           | "Yet, when all is said and done, we find ourselves, to our
           | own surprise, in agreement with Kolmogorov and in
           | disagreement with his critics, on nearly all technical
           | issues. As noted in Appendix A, each of his axioms turns out
           | to be, for all practical purposes, derivable from the Polya-
           | Cox desiderata of rationality and consistency. In short, we
           | regard our system of probability as not contradicting
           | Kolmogorov's; but rather seeking a deeper logical foundation
           | that permits its extension in the directions that are needed
           | for modern applications."
        
           | hackandthink wrote:
            | I've read in quantum information theory papers that Jaynes
            | misunderstood Bell (he just didn't get it).
           | 
           | https://physics.stackexchange.com/questions/233203/has-
           | jayne...
        
             | kgwgk wrote:
              | I just found this recent writeup on the subject. I haven't
              | had time to read it yet, but it looks interesting (he hasn't
              | been active on HN for almost two years, by the way).
             | 
             | https://scottlocklin.wordpress.com/2022/06/06/poking-
             | holes-i...
        
         | vmilner wrote:
         | Well, I do... :-)
         | 
         | Less facetiously, I think Jaynes is becoming better known as
         | Bayesian techniques have become more mainstream.
        
         | UniverseHacker wrote:
         | "rationalists" are obsessed with Jaynes, in as much as HN
         | overlaps with that community I'd say a lot of people on here
         | are familiar
        
         | jqgatsby wrote:
         | The best study guide to Jaynes that I've found is from David
         | Blower (sadly, recently deceased):
         | 
         | Information Processing: The Maximum Entropy Principle
         | https://a.co/d/71tL5bw
         | 
         | He really takes apart the maximum entropy principle in a
         | comprehensible way, to the point where one can see how to apply
         | it to new problems.
         | 
         | (the volumes I and III are also good but not strictly
         | necessary)
        
           | hackandthink wrote:
           | A simple example by Nassim Taleb:
           | 
           | https://www.fooledbyrandomness.com/blog/2021/09/07/estimatin.
           | ..
        
         | akuro wrote:
         | Anybody who loves statistical mechanics has surely heard of
         | Jaynes.
         | 
         | Unfortunately, those who like statistical mechanics seem few
         | and far between. :(
        
           | code_biologist wrote:
           | Reminds me of a classic quote from a stat mech textbook:
           | 
           | "Ludwig Boltzmann, who spent much of his life studying
           | statistical mechanics, died in 1906, by his own hand. Paul
           | Ehrenfest, carrying on the work, died similarly in 1933. Now
           | it is our turn to study statistical mechanics."
           | 
           | David L. Goodstein, States of Matter
        
           | B1FF_PSUVM wrote:
           | Well, who likes having it pointed out that all the air in the
           | room huddling in a corner is a possibility, albeit very, very
           | small ...
        
       | deltasevennine wrote:
       | I have a related question to this topic. Is probability axiomatic
       | to reality? Does it exist on the same level as logic where logic
       | is axiomatic to reality and just assumed to be true? Or is it
       | simply a phenomenon arising from logic?
       | 
       | It seems like probability just happens to work without
       | explanation? Intuitively this seems a bit strange since it feels
       | as though probability should be derived from something else. Not
       | sure if I'm correct here.
       | 
       | What confuses me even more is that I do know logic can be defined
       | in terms of probability. Causal connections can be probabilistic.
       | If A then 25% chance of B and so on.
        
         | quickthrower2 wrote:
         | A lot of what probability is, is the extent of lack of
         | information.
         | 
         | The Monty Hall Problem is a great example of this.
         | 
         | So I would say the more fundamental thing might be information
         | theory.
         | 
          | This is my layman view. Not an expert.
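          | 
          | (A minimal simulation sketch of the Monty Hall point -- my own
          | illustration, assuming the standard three-door setup where the
          | host always opens a losing door:)
          | 
          |     import random
          | 
          |     def monty(switch, trials=100_000):
          |         wins = 0
          |         for _ in range(trials):
          |             car = random.randrange(3)
          |             pick = random.randrange(3)
          |             # Host opens a door that is neither the pick nor the car.
          |             opened = next(d for d in range(3) if d != pick and d != car)
          |             if switch:
          |                 pick = next(d for d in range(3) if d != pick and d != opened)
          |             wins += (pick == car)
          |         return wins / trials
          | 
          |     print("stay:  ", monty(False))   # ~ 1/3
          |     print("switch:", monty(True))    # ~ 2/3
          | 
          | The extra information (which door the host opened) is exactly
          | what makes switching better.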
        
           | lupire wrote:
           | Probability is weight/count of cases of interest divided by
           | weight/count of all possible cases.
        
             | quickthrower2 wrote:
             | I just rolled a 6 sided die here and looked at the outcome.
              | For you, the probability it is a 6 is 1/6. For me, it isn't
              | 1/6. Same object!
        
             | mturmon wrote:
             | Perhaps you're just kidding around, but of course that's
             | not good enough for a definition.
             | 
             | It doesn't handle continuous random quantities. It doesn't
             | even handle situations with a discrete outcome but where
             | counting the possible cases isn't well-defined (Buffon's
             | needle being one, but an even better one being the chance
             | of a tossed thumbtack landing point-up). It also doesn't
             | handle cases where symmetry or physics can give the answer,
             | but you can't count cases because they aren't finite or
             | aren't necessarily a-priori equiprobable.
        
         | nyc111 wrote:
          | Probability, it seems to me, cannot be fundamental, because a
          | machine built to flip a coin with always the same force can be
          | adjusted to always give heads, or always tails. In such a setup
          | there is no probability.
         | 
         | From George Boole's The Laws of Thought, p.244: "Probability is
         | expectation founded upon partial knowledge. A perfect
         | acquaintance with _all_ the circumstances affecting the
          | occurrence of an event would change expectation into certainty,
         | and leave neither room nor demand for a theory of
         | probabilities."
         | 
         | Can we deduce from this that nature is not probabilistic?
        
           | dariosalvi78 wrote:
           | unfortunately nature IS, most likely, probabilistic:
           | https://en.wikipedia.org/wiki/Hidden-variable_theory
        
           | likeabbas wrote:
           | What if the machine was built with 99% precision?
        
           | notafraudster wrote:
           | It is true that (at least above the level of quantum physics)
           | we tend to believe that reality is deterministic. When the
           | meteorologist gives a chance of rain, in truth if they had
           | perfect forward information on all of the clouds and pressure
           | systems, they would simply declare whether or not there was
           | future rain. In cases like physical phenomenon, you can think
           | of observed uncertainty or chance as being a product of the
           | exact settings of the unmodeled but deterministic factors
           | underlying a particular outcome, or else errors in the
           | functional form of the model with respect to the measure. We
           | tend to assume that unmodeled factors are as-if orthogonal to
           | the causes we are interested in modelling, and thus zero-
           | centered, and our models minimize error predicated on this
           | assumption.
           | 
           | In your proposed flipping model, there are likely to be very
           | small physical imprecisions (vibrations in the flipper, say,
           | drifting tension of some kind of spring or actuator, or small
           | amounts of circulating air, or perhaps tiny imprecisions in
           | the way the coin is loaded into a slot). The machine might
           | always flip heads, but it's still possible to say that
           | whatever arbitrary degree of certainty you need to model the
           | coin's behaviour in the air to achieve 100% accuracy, there
           | could still be arbitrarily smaller error below that
           | threshold, and we'd view this as "randomness" even if it
           | isn't by the laws of physics.
        
           | empyrrhicist wrote:
           | Not a physicist, but your argument is pretty unconvincing in
           | that it relies entirely on intuition about classical physics,
           | ignoring quantum phenomena entirely. If one were to argue
           | that probability were fundamental, they'd very likely start
           | by describing wave functions, which are probabilistic and
           | seem pretty close to fundamental to observable reality.
        
         | kqr wrote:
         | There are two things we mean by "probability". The first is
         | propensity, and this has clear links to information theory, as
         | another person commented about already.
         | 
         | It's important to emphasise that in terms of propensity, it
         | doesn't matter whether or not the event has occurred, what
         | matters is your knowledge about it. A flipped fair coin has a
         | definite side up (as can be verified by a silent third
         | observer) but for you, who has not yet observed which side it
         | is, your best guess is still either side with 50 % probability.
         | 
         | Similarly, if you only know there's a soccer game going on, you
         | might guess that the stronger team will win with 60 %
         | probability (based on historic frequencies of exchangeable
         | situations), but someone who has seen the score and knows the
         | weaker team has a lead and knows there's only a few minutes
         | left of the game will judge there to be a 2 % probability the
         | stronger team wins. Same situation, different information,
         | different judgements.
         | 
         | That's the first meaning of "probability". What we also mean
         | with that word is "the rules of probabilistic calculation".
          | These are based on mathematical ideas like coherency (if
          | exactly one of two things must happen, their probabilities
          | should add up to 100 %) and can definitely be taken as
          | axiomatic.
         | 
         | All of this is not an answer to your question, but it might
         | make the discussion richer.
        
           | kgwgk wrote:
           | There are multiple interpretations (corresponding to your
           | first part). One of them is indeed about "propensities" but
           | the most common ones are about "frequencies" and about
           | "uncertainty".
        
         | panda-giddiness wrote:
         | Probability theory can be interpreted as an extension to logic
         | where variables can take fractional values (rather than just be
         | 0/false or 1/true).
         | 
          | E.g.,
          | 
          |     ((A or B) and C) = (A and C) or (B and C)
          | 
          |     => P[(A or B) and C] = P[(A and C) or (B and C)]
          |                          = P[(A and C)] + P[(B and C)] - P[(A and B) and (A and C)]
          |                          = P[(A and C)] + P[(B and C)] - P[A and B and C]
          |                          = P[A|C] P[C]  + P[B|C] P[C]  - P[A and B and C]
         | 
          | Notice the last couple of lines -- this is the way in which
          | probability extends logic. If you take the limit where P[A],
          | P[B], P[C] are each either 0 or 1, the probability statement
          | reduces to the logic statement at the top.
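          | 
          | (A minimal Python sketch of the identity above -- my own
          | illustration; the weights and the helper P are hypothetical,
          | not from the comment:)
          | 
          |     import itertools, random
          | 
          |     # Tiny finite probability space: random weights on the 8
          |     # truth assignments of (A, B, C).
          |     outcomes = list(itertools.product([False, True], repeat=3))
          |     w = [random.random() for _ in outcomes]
          |     total = sum(w)
          | 
          |     def P(pred):
          |         return sum(wi for wi, o in zip(w, outcomes) if pred(*o)) / total
          | 
          |     lhs = P(lambda a, b, c: (a or b) and c)
          |     rhs = (P(lambda a, b, c: a and c) + P(lambda a, b, c: b and c)
          |            - P(lambda a, b, c: a and b and c))
          |     assert abs(lhs - rhs) < 1e-12
          | 
          | Put all the weight on a single outcome (so every probability is
          | 0 or 1) and the same check is just the Boolean identity.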
        
         | BOOSTERHIDROGEN wrote:
         | Also in relation to reductionism.
        
         | ThomPete wrote:
          | Probability does not exist in reality, as reality is open
          | ended. It does, however, exist in, e.g., a deck of cards and a
          | game that's defined.
         | 
          | Probability is often misused to say things about reality,
          | though. You see that especially in computer simulations,
          | whether in economics, weather, etc.
         | 
         | Different initial conditions are put into the models and
         | simulated. And the probability is calculated based on what the
         | majority of those models say.
         | 
          | But those initial conditions are guesses, not actual objective
          | explanations. If they were, you would only need to run one
          | simulation rather than a range.
         | 
         | A lot of statistics is pure placebo. Purely retrospective.
         | 
         | In reality it either is or it isn't. If you have good
         | explanations like we do in physics you don't need probability.
         | 
          | David Deutsch IMO has the sanest rebuttal of probability.
         | 
         | http://www.daviddeutsch.org.uk/2014/08/simple-refutation-of-...
        
           | fjkdlsjflkds wrote:
           | > In reality it either is or it isn't. If you have good
           | explanations like we do *in physics you don't need
           | probability*.
           | 
           | So... are you saying Statistical Mechanics (to give an
           | example) is not part of Physics?
           | 
            | In real life, the amount of information you have (and _can
            | have_) about a physical process is limited. You can either
           | throw your hands up and say "we can't know for sure", or you
           | can use probabilities to try to get somewhere.
           | 
           | How do you define the position of an electron without using
           | probabilities?
        
             | ThomPete wrote:
              | Just because it's limited doesn't mean I can assign a
              | probability to it.
             | 
             | As I said. The difference is closed and open systems.
             | 
              | No problem with the probability of getting a given card
              | based on what has already been dealt in, e.g., a game.
             | 
              | The problem arises when it's applied to predicting reality
              | in open-ended systems -- as I said, weather, economics,
              | climate, etc. -- and where history is being used as some
              | sort of benchmark for the future.
             | 
              | There is no probability whether an asteroid is on the path
              | towards earth. It either is or it isn't.
             | 
             | In other words. We can either explain or not.
        
               | fjkdlsjflkds wrote:
               | So, would you claim that: "There is no probability
               | whether an *electron* is on the path towards earth. It
               | either is or it isn't."? I guess you must know something
               | that Heisenberg didn't.
               | 
               | Good luck defining things such as "path" and "position",
               | without using probabilities, for non-macroscopic objects.
               | 
               | Also, what is your opinion about the classical "double-
               | slit experiment"? Either a particle passes through a
               | slit, or it passes though the other, right?
        
               | kgwgk wrote:
               | There is no probability whether the next card in the deck
               | is an ace. It either is or it isn't.
        
               | ThomPete wrote:
                | There is a probability; it can be described based on what
                | cards have already been dealt. It does not change whether
                | that specific card is an ace or not.
               | 
               | So yes we agree.
        
               | kgwgk wrote:
               | Do we also agree that - even if you don't - other people
               | are able to conceive a probability that the next card in
               | the deck is an ace, a probability that I was born on a
               | Saturday, a probability that Germany wins the World Cup,
               | etc.?
        
               | ThomPete wrote:
                | The discussion isn't about whether people can conceive of
                | all sorts of things. The discussion is about probability's
                | relationship to reality.
        
               | kgwgk wrote:
               | I'm pretty sure you can as well!
               | 
                | If you can pick Brazil or Morocco to win $1000 in case
                | the one you choose wins the World Cup, which one do you
                | pick? Why?
               | 
               | Are you really indifferent between them because you
               | cannot conceive how saying that one is more likely to win
               | than the other could have any relationship to reality?
        
           | kqr wrote:
           | > In reality it either is or it isn't. If you have good
           | explanations like we do in physics you don't need
           | probability.
           | 
            | You mean good explanations _and_ observations? You're
           | absolutely right in that if you are able to observe all the
           | relevant information with no noise, you don't need
           | probability. But there are a lot of systems where you can't
           | noiselessly observe what you want - this is where probability
           | is important.
        
             | ThomPete wrote:
              | Yes, probability has its place, for example in error
              | correction in closed systems.
             | 
             | But you can't predict the future with it. That's what I am
             | trying to get at.
        
               | kqr wrote:
               | So what general method of predicting the future are you
               | using that's better?
               | 
               | Try to avoid the opt-out "I don't predict the future
               | unless I have perfect explanations of everything" because
               | I know you don't -- no-one does.
        
               | ThomPete wrote:
               | I use the same one as you. I conjecture based on my
                | ability to explain why I think x will happen or won't
                | happen.
               | 
               | No amount of putting percentages on X changes X. It
               | either happens or doesn't.
        
               | kqr wrote:
               | So if I asked you whether you found it more likely that
               | you'll ride a helicopter tomorrow, or that a Democrat
               | will be elected president in the next US presidential
               | election, what would be your answer and why?
        
               | ThomPete wrote:
               | I would say "I don't know" and I would be lying if I said
               | anything else.
               | 
               | I can tell you I don't have any plans of going on a
               | helicopter ride tomorrow. I can also tell you that I have
               | no idea if the next president will be democrat.
               | 
               | What formula do you propose we use to calculate the
               | probability?
        
               | kqr wrote:
               | And if I proposed a bet where you pay me $10 now and I
               | pay you $40 if there's a Democrat president next
               | election, you wouldn't take the bet because "you don't
               | know?"
               | 
               | (This situation could be put in a less abstract way:
               | there's a business opportunity that costs some money to
               | realise but you only reap the benefit in the right
               | political climate.)
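                | 
                | (For concreteness, the arithmetic behind the bet: with a
                | probability p of a Democrat winning, you net +$30 if it
                | happens and -$10 otherwise, so
                | 
                |     EV = p * 30 - (1 - p) * 10 = 40p - 10,
                | 
                | which is positive exactly when p > 25 %.)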
        
               | ThomPete wrote:
               | I might take the bet but I would just be guessing or I
               | might work towards trying to find a solution to turn the
               | guess into an explanation.
               | 
               | At no time does probability help me with figuring out
               | what the outcome is.
               | 
               | I can either explain what will happen or I can't. There
               | is no 50/50. I just don't have the correct explanation
               | i.e. an explanation that is hard to vary.
        
               | kqr wrote:
               | Can I interpret your taking the bet as an admission that
               | it's a bet that comes with a positive expectation? (In
               | the sense that if you took similar bets very many times,
               | you would end up with an almost sure profit.)
        
           | kgwgk wrote:
           | > If you have good explanations like we do in physics you
           | don't need probability.
           | 
           | Who is we? A lot of physics textbooks have a good amount of
           | probability at their core.
        
             | ThomPete wrote:
             | Yes and most of that is wrong. Physics doesn't deal with
             | probability but with explanations. (Yes also in QM)
        
               | [deleted]
        
               | kgwgk wrote:
               | Explain that to the Royal Swedish Academy of Sciences.
               | Maybe they can still take back this year's prize and give
               | it to some physicists who deserve it.
        
           | mturmon wrote:
           | That David Deutsch web-link is not saying what you think it's
           | saying.
        
       | BOOSTERHIDROGEN wrote:
        | Any statistics books that have a similar approach to this?
        
         | kqr wrote:
         | This was much more common in the first 2/3 of the 20th century
         | than it is today. I can strongly recommend _Theory of
          | Probability_ (de Finetti, compiled in 1970 based on work de
          | Finetti did as early as the 1930s) and _Foundations of Statistics_
         | (Savage, 1972) - the latter leans a bit on the former but
         | expands on it with useful perspectives.
         | 
         | I recommend you start with these basic theoretical books to get
         | a sense of what it's all built on. But then if you want more
         | practical advice about how to handle things, books on sampling
          | theory tend to hit a sweet spot between theory and practice, in
         | my experience. I like _Sampling Techniques_ (Cochran, 1953) and
         | _Sampling of Populations_ (Levy  & Lemeshow, 2013).
        
           | BOOSTERHIDROGEN wrote:
           | Thanks for the sampling techniques.
        
       ___________________________________________________________________
       (page generated 2022-10-10 23:01 UTC)