[HN Gopher] Why is AI hard and physics simple?
___________________________________________________________________
Why is AI hard and physics simple?
Author : bigdict
Score : 42 points
Date : 2021-06-22 19:15 UTC (3 hours ago)
(HTM) web link (arxiv.org)
(TXT) w3m dump (arxiv.org)
| est31 wrote:
| Physics is not simple. E=mc^2 might look nice on a t-shirt, but
| it is a cherry-picked formula. The Newtonian mechanics you learn
| in high school might seem simple to someone with a MINT degree,
| but it describes only a small part of the world, and complicated
| things like friction are discarded.
|
| General relativity, quantum chromodynamics, etc. They are all
| incredibly complicated.
| kergonath wrote:
| Even bog standard classical mechanics can be very complicated
| and unintuitive, once you go to things like Hamiltonian
| dynamics and many-body problems. Then you have statistical
| physics and, as you said, relativity and quantum mechanics.
| I've recently spent some time learning about some quantum
| gravity theories; these things are hard.
| xxpor wrote:
| It's probably worth distinguishing between conceptually
| simple and mechanically simple. A many-body problem is
| relatively easy to understand: you're just extending existing
| rules to more things at once. Now, actually calculating a
| position at a given time from some initial vectors? That's
| complicated.
| HWR_14 wrote:
| What is a MINT degree?
|
| And E=mc^2 is not a simple formula. Okay, it's simple. But
| deriving it and understanding why it's like that was what,
| junior year in college?
| Cederfjard wrote:
| I think MINT is the German version of STEM.
| s3r3nity wrote:
| Physicists will tell you that F=ma can get you through
| virtually all of kinematics and dynamics.
|
| I consider that a big chunk of the world.
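|
| A minimal sketch of what "F=ma gets you a long way" looks like
| in practice, assuming an idealized point mass and a forward-
| Euler integrator (the step size and numbers are illustrative,
| not from the article):
|
|     import numpy as np
|
|     def euler_step(x, v, force, m, h):
|         # One step of F = m*a: update velocity from the
|         # acceleration, update position from the velocity.
|         a = force(x, v) / m
|         return x + h * v, v + h * a
|
|     # Example: a projectile under gravity alone (no drag).
|     m, h = 1.0, 0.01
|     force = lambda x, v: m * np.array([0.0, -9.81])
|     x, v = np.array([0.0, 0.0]), np.array([10.0, 10.0])
|     for _ in range(200):
|         x, v = euler_step(x, v, force, m, h)
|
| The same pattern, with a better integrator and a richer force
| law, carries you through most of introductory dynamics.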
| Retric wrote:
| Only in terms of spherical-cow models. Try to apply pure F=ma
| kinematics to the real world and deformation is one of many
| huge issues it ignores.
| AlliedEnvy wrote:
| To make a stab at the question posed by the title:
|
| We've known the principles of Newtonian physics since, well,
| Newton.
|
| We'll need an Isaac-Newton-level of insight into intelligence
| before we can make it so simple.
|
| Nevermind that we can't even seem to agree on the definition of
| intelligence.
|
| As to the definition, I have an objection to calling the current
| big-data stats we do nowadays "machine learning" or "artificial
| intelligence". There is no intelligence there.
|
| Rather, to allude to the article, I consider it more "machine
| muscle memory" or "artificial intuition". The algorithm can tell
| you, "I have a really good hunch about this based on the zillions
| of examples I've seen" but it can't derive underlying truths to
| reason about why.
|
| Perhaps we are recapitulating phylogeny when it comes to
| artificial neural systems? We have something like an autonomic
| nervous system, and a brainstem, but we need so much more to get
| to intelligence.
| streamofdigits wrote:
| For what it is worth: physics is not easy, never was, never will
| be. The painfully slow invention (or is it discovery?) of the
| mathematical machinery required to understand physical phenomena
| is one of the most astonishing feats of the human brain. In my
| view it sets it apart from anything remotely "AI"-ish.
|
| From early calculus to differential geometry, Hilbert spaces,
| and whatnot, the brain doesn't fit models to data; it is making
| up categories and concepts (new classes of metamodels, if you
| wish) as it goes along (as improved experimental devices augment
| our sensory inputs). To cope with and explain these totally
| indirect information flows, the brain conjures up symmetries in
| invisible abstract spaces, invariants and visual imagery from
| alien internal worlds, and pursues "thought experiments" to
| restore sanity... Machine-learning-style fitting of a "model" to
| data is just one of the final steps. Important, but hardly
| defining the process.
|
| The "AI" folks oversold their goods by a breathtaking factor and
| are now exposed to the entire planet. No physicist will ever be
| able to bail you out :-)
| Koshkin wrote:
| Physics is only _semi-simple._
| PeterWhittaker wrote:
| I don't know where to begin....
|
| _Physics simple_?
|
| _Physical intuition_?
|
| The single most successful program of physics is quantum
| mechanics, and it is neither _intuitive_ nor _simple_.
| Relativity, while conceptually simple, isn't so simple either
| and is far from intuitive (consider that momentum is conserved
| during a relativistic near-miss collision only if one considers
| the entire collision over a sufficiently long period, since at
| any moment the force vector between the bodies does NOT align
| with the separation vector, because the force acting at one
| moment was exchanged, at _c_, when the bodies were in different
| positions).
|
| There are a lot of simple concepts in physics, many of them
| basically teaching aids to get people started. When one gets deep
| into the field, simple and intuitive go by the wayside.
| concreteblock wrote:
| Why would you even bother responding based on just the headline
| of the article? The author is not using 'simple' as a synonym
| for 'easy to learn'.
| 6gvONxR4sf7o wrote:
| It's a risk any catchy headline takes. Seems like the same
| property that entices people to click also entices people to
| engage the headline itself.
| dacracot wrote:
| Because it is awful.
| concreteblock wrote:
| Titles have to be short, and as such they can't hope to
| represent the contents of the article completely
| accurately. If you wanted to do that you would have to make
| the title equal to the article's contents.
|
| Based on the parts which I've read so far, a more accurate
| title would be 'Why are some currently hot parts of AI not
| well understood, while some parts of physics are well
| understood?'
|
| I think the original title is an ok approximation of this.
| PeterWhittaker wrote:
| I read TFA, hence the reference to intuition. There is
| nothing in the article that makes a compelling case that
| physics is simple, other than rhetoric.
|
| We forget at our peril the Michelson-Morley experiment and
| the ultraviolet catastrophe, and if we forget these, we may
| assume, as people did then, that we have it all figured out.
|
| Of course, active researchers in the subject, both
| theoretical and experimental, do not forget.
| mellosouls wrote:
| TBF it's a terrible title and the overview isn't exactly
| enticing in its implication:
|
| _Let's get physicists to look at AI so we might make some
| progress, btw here's a new book that tells us how_
|
| I'm not saying that's what the article is actually about but
| that's what I read from it and it's crass enough that I
| didn't read further.
| slver wrote:
| Physics is simple, is that so... Combine quantum mechanics with
| general relativity then. Einstein couldn't.
| x2dhump wrote:
| some context from TFA:
|
| "Please note that the notion of simplicity to which we are
| appealing is not at all meant to suggest that physics is trivial.
| Instead, we mean it as a compliment: we have the utmost respect
| for the work of physicists. This essay is an apologia for
| physicists doing machine learning qua physicists; it is meant to
| interrogate what it is about the approach or perspective of
| physicists that allows them to reach so far in explaining
| fundamental phenomena and then consider whether that same
| approach could be applied more broadly to understanding
| intelligence"
|
| edit: added quotes
| dacracot wrote:
| Physics is simple? Can you describe to me what gravity is? Have
| you published your unified theory yet?
| onhn wrote:
| The author is talking about how a given physics model, e.g. a
| particular quantum field theory, appears simple when one is
| presented with it. This is the kind of limited perspective on
| research that an undergraduate physicist may develop simply by
| solving the hand-crafted problems that are presented to them.
|
| However, the true difficulty in physics is arriving at that model
| in the first place. The decades of work offered up against
| experiment, and the associated conceptual leaps in understanding
| required to get to, e.g., a quantum field theory which
| successfully predicts things, are nothing short of a monumental
| achievement. To say that physics is simple is ludicrous.
| concreteblock wrote:
| You are missing the point of the article. The author is not
| trying to argue that AI is 'harder' than physics, like a
| freshman CS major might argue with their physics friends.
|
| The author is talking about how our physical theories, such as
| QFT, currently have more predictive power than any theories we
| currently have about machine learning/deep learning.
|
| (The author has a PhD in theoretical physics.)
| onhn wrote:
| I think the article misses the point of what physics is. It
| is not a collection of "sparse" models and principles,
| rather, it is a scientific discipline from which such models
| have emerged.
|
| You will notice the article conflates the two things: physics
| and the known laws of physics (e.g. first para in section
| 1.2). Simplicity of the latter does not imply simplicity of
| the former, but the article assumes that it does in order to
| tackle/state the question as posed: "Why is AI hard and
| physics simple?".
| aeternum wrote:
| While QFT makes some amazingly precise predictions in certain
| areas, like the fine-structure constant, it is nearly useless
| for predicting most of chemistry.
|
| In practice, the computations required to use the QFT model
| are just too complex for modern computers, even for single
| atoms with more than a few protons, not to mention larger
| molecules. Instead, we must use simplified models like the
| Bohr model to make predictions about molecular bonds.
|
| This actually seems very similar to AI, where we understand a
| lot (though not everything) about basic neurons, yet the
| emergent phenomenon of intelligence is very difficult to
| predict due to the explosion of computational complexity.
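|
| To make that concrete, here is a rough back-of-the-envelope
| sketch (my own illustration, not from the article or from any
| real quantum chemistry code): a naive grid discretization of
| the many-electron wavefunction, used here as a stand-in for the
| full field-theoretic calculation, needs exponentially many
| amplitudes in the particle count.
|
|     # M grid points per spatial dimension, N electrons:
|     # the wavefunction has M**(3*N) complex amplitudes.
|     M = 10                          # very coarse grid
|     for N in (1, 2, 5, 10):
|         amplitudes = M ** (3 * N)
|         mem = amplitudes * 16       # bytes, complex128
|         print(N, f"{mem:.3e} bytes")
|
| Already at N=10 this is ~1.6e31 bytes, which is why practical
| chemistry leans on drastically simplified models instead.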
| concreteblock wrote:
| That's a good point. I guess our current mathematics is not
| good enough to say much about the macroscopic behaviour of
| large interacting models.
| Jefff8 wrote:
| Physics has had a 470-year head start, if you measure from
| Copernicus. Of course it looks simple. It wasn't simple at the
| time; it's taken something like 14 generations so far.
| eutectic wrote:
| Physics might be simple, but it's not easy. Especially if you
| want to make predictions for complex systems.
| swagasaurus-rex wrote:
| Physics might be simple, but a physics engine is extremely
| complex.
| banamonster1 wrote:
| ai is top-down and physics is bottom-up.
|
| physics is also mostly deterministic (attach probability
| distributions for stochastic/quantum stuff) and there are
| well-defined rules (energy, symmetry, Noether, etc).
|
| at the end of the day ai has some space for models and so does
| physics. because physics has well defined rules it's easier to
| apply constraints to that space vs ai/ml where it's informed
| guesswork.
|
| of course there will be a correspondence between parameters in a
| model and emergent physical phenomena ... and i'm sure really
| nice scaling laws, etc. will come out; this is just
| coarse-graining.
|
| Onsager and the like were onto this stuff way before deep
| learning was a thing. i think this connection is uninteresting
| because optimization at its heart is physics. dl is just one
| aspect.
|
| - a physicist (escaped to the greener pastures of swe, shame
| really, i miss it but not the wl balance)
| dragonwriter wrote:
| Physics is explaining what is, in a fairly testable domain.
|
| AI is figuring out how to replicate something fuzzily understood,
| from a difficult-to-test domain, using techniques that are
| largely in a different domain altogether, because even for the
| parts of the inspirational domain we kinda-sorta understand, we
| don't have the tools to directly reproduce them easily, so at
| best we simulate them in alternative media.
|
| ("Physics is simple" overstates the case, still, but "AI is hard"
| understates the case, so, relatively speaking, its something of a
| wash.)
| maxwells-daemon wrote:
| Most of the responses here seem to imply that the author doesn't
| understand that physics can be complicated (in the sense of being
| hard to learn or having big equations). He studies theoretical
| physics at MIT [1], so I expect he does.
|
| On the content: it's pretty weird that our best models don't use
| much of the world's underlying structure at all. State-of-the-art
| vision models like vision transformers and MLP-mixer do just fine
| when you shuffle the pixels. You could argue that modern image
| datasets are so big that any relevant structure could be learned
| by attention, but it still feels like we're doing _something_
| wrong when pixel order doesn't matter at all ¯\\_(ツ)_/¯
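|
| As a rough sketch of the kind of experiment being described
| (my own illustration; the permutation setup and sizes are
| assumptions, not from the paper): apply one fixed random
| permutation to the pixels of every image, at both training and
| evaluation time, and compare accuracy against the unshuffled
| baseline.
|
|     import torch
|
|     def make_pixel_shuffler(h=224, w=224, seed=0):
|         g = torch.Generator().manual_seed(seed)
|         perm = torch.randperm(h * w, generator=g)
|         def shuffle(img):           # img: (C, H, W)
|             c = img.shape[0]
|             flat = img.reshape(c, -1)[:, perm]
|             return flat.reshape(c, h, w)
|         return shuffle
|
|     # Use the *same* shuffler for every train and test image;
|     # the claim above is that ViT/MLP-Mixer accuracy barely
|     # moves under this fixed permutation.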
|
| [1] https://danintheory.com/
| erostrate wrote:
| Vision models need the pixel ordering to match the one they
| have been trained on, in order to work.
|
| They won't generalize after training to transformations of the
| data that they haven't been trained on, even simple ones such
| as rotations, whereas humans will.
|
| So I would argue that vision models do use the "underlying
| structure", and even that one of their problems is that they
| make use of some "underlying structures" that are not actually
| important, such as image luminosity, rotations, etc. I think
| people usually augment the data with these transformations
| during preprocessing to enforce invariance.
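|
| For example, a minimal augmentation pipeline along those lines,
| assuming torchvision (the specific transforms and ranges here
| are illustrative choices, not a claim about what any given
| model used):
|
|     from torchvision import transforms
|
|     augment = transforms.Compose([
|         transforms.RandomRotation(degrees=15),
|         transforms.RandomHorizontalFlip(),
|         transforms.ColorJitter(brightness=0.2),
|         transforms.ToTensor(),
|     ])
|     # Applied to the training images so the model sees rotated,
|     # flipped, and re-lit versions of each one and is pushed
|     # toward ignoring those transformations.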
| [deleted]
| whatshisface wrote:
| All human endeavors will converge on equal difficulty because
| each is only limited by the capability of the people doing it.
| rexreed wrote:
| This was written to promote a book [0]
|
| "As a first step in that direction, we discuss an upcoming book
| on the principles of deep learning theory that attempts to
| realize this approach.
|
| Comments: written for a special issue of Machine Learning:
| Science and Technology as an invited perspective piece"
|
| So take it for what it's worth.
|
| [0] https://deeplearningtheory.com/PDLT.pdf
| de_Selby wrote:
| Ah, well that explains the clickbaity title, which is all most
| comments here are discussing at face value.
| dekhn wrote:
| The author has it completely backward. [edit: on further reading,
| the title is clickbait but the article content is consistent with
| my point below]
|
| With only a few exceptions, ML is incredibly simple (there is no
| AI). The math is simple, the mechanics of evaluating it are
| simple, the reason it works is simple, and it only really works
| well if you have absurd amounts of data and CPU time.
|
| Physics is... determining the mathematics you need to know on the
| fly while discovering and explaining many phenomena. You can
| spend decades focusing on matrix multiplications and other fairly
| straightforward trivia to analyze your particle trajectories, and
| then suddenly, you need to know group theory or some completely
| different field of math just to understand the basic modelling.
|
| Personally I think the most impressive thing in physics and stats
| so far is our ability to predict the trajectories of solar system
| objects far into the future. After some very serious numerical
| analysis over the past 50 years, we've reached the point where
| there aren't many improvements we can make, and most of them come
| from identifying new objects, their position, and mass
| (data/parameters), and the real argument is about whether the
| underlying behavior is truly unpredictable even if you have
| perfect information.
|
| Of course, the last best work in this area was done by Sussman,
| who has been an ML researcher for some time:
| (https://www.researchgate.net/publication/6039194_Chaotic_Evo...)
|
| As you can see, physicists pretty much invented all the math to
| do ML whilst solving _other_ problems along the way:
| https://en.wikipedia.org/wiki/Stability_of_the_Solar_System
|
| In fact, when I show many of my friends who were physics people
| the code of a large-scale batch training system, they wonder why
| they did physics instead of CS, because the math is so
| unbelievably simple compared to the tensor path integrals they
| had to learn in senior physics.
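|
| (For flavor, a bare-bones sketch of the kind of calculation
| behind those solar-system predictions: plain Newtonian gravity
| with a kick-drift-kick leapfrog step. This is my own toy
| illustration; real ephemeris work uses far more careful
| integrators, relativistic corrections, and fitted parameters.)
|
|     import numpy as np
|
|     G = 6.674e-11   # m^3 kg^-1 s^-2
|
|     def accel(pos, m):
|         a = np.zeros_like(pos)
|         for i in range(len(m)):
|             for j in range(len(m)):
|                 if i != j:
|                     r = pos[j] - pos[i]
|                     a[i] += G * m[j] * r / np.linalg.norm(r)**3
|         return a
|
|     def step(pos, vel, m, dt):   # kick-drift-kick leapfrog
|         v_half = vel + 0.5 * dt * accel(pos, m)
|         pos = pos + dt * v_half
|         vel = v_half + 0.5 * dt * accel(pos, m)
|         return pos, vel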
| banamonster1 wrote:
| people are missing the point
|
| the author is talking about constraint optimization in physics
| vs ml/ai
|
| constraint optimization is easier with a well-defined prior vs
| the guesswork in ml algos.
|
| dl models in physics can scale and emergent phenomena will
| depend on scaling parameters -- this is coarse-graining in
| physics.
| dekhn wrote:
| I didn't miss the point. my graduate work was using
| constraint optimizers in molecular dynamics (n-body physics)
| and it translated to ML (I didn't have to relearn anything).
| The one part that was truly simpler is the objective
| functions in ML are believable to be convex, while n-body
| physics with constraints are highly non-convex).
| banamonster1 wrote:
| people on the board, not you
|
| > my graduate work was using constraint optimizers in
| molecular dynamics
|
| me too for qm/mm sims -- did some rg/complex systems work
| too ;)
| concreteblock wrote:
| The title is 'clickbait' for sure, but not necessarily
| incorrect. After all 'simple' can mean different things to
| different people. And the author clarifies what he means by
| 'simple' in the rest of the article.
| acdha wrote:
| It's incorrect, and everyone knows why a provocative title is
| used even if the author has to spend the rest of the paper
| walking it back.
| gfiorav wrote:
| When Newton defined his derivative, it took two pages. Any
| textbook nowadays can do it in a paragraph.
|
| AI is not yet common knowledge. It isn't as well understood.
|
| But you just wait.
| meiji163 wrote:
| They'll put anything on hep-th nowadays
| sidlls wrote:
| Physics isn't simple. It took (literally) thousands of years of
| study by very smart people to get to the point where what we call
| "intuition" about the physical world is what it is. And, as any
| physicist who paid attention in class will tell you, even _that_
| intuition isn't really right.
|
| Any simplicity observed in physics is born of long familiarity,
| or is the result of underlying complexities being masked or
| approximated away.
| danbruc wrote:
| Physics is simple in a certain sense, and the paper explains
| this. In theory the force acting on something you drop could be
| a function of the state of every particle in the universe, and
| each one could contribute with a different weight. But this is
| not the case: things are only influenced by things nearby, and
| the weights involved are not arbitrary; the charge of each
| electron, for example, is the same. In that sense physics is
| unbelievably simple compared to what it could be.
| dekhn wrote:
| Nobody has proved the conjecture "things are only influenced
| by things nearby"; this is one of the largest ongoing
| arguments in QM (locality: https://web.mit.edu/asf/www/Press/
| Holmes_Physics_World_2017....)
|
| We also don't know for sure that the "constants" are constant
| throughout the universe (spatially and temporally). This is
| assumed for now, and seems almost certainly true.
|
| I think it's unsafe to assume the above are Absolutely True
| and that that's why physics is simple.
| danbruc wrote:
| _Nobody has proved the conjecture "things are only
| influenced by things nearby"; this is one of the largest
| ongoing arguments in QM_
|
| This is a much more nuanced matter. Non-locality as in
| global wave function collapse or Bohmian mechanics does not
| have the same consequences as classical non-locality: there
| is no causal influence from things you do to an entangled
| particle at the other end of the universe. Also, to entangle
| particles, they have to first interact locally before they
| can be separated.
|
| _We also don't know for sure that the "constants" are constant
| throughout the universe (spatially and temporally). This is
| assumed for now, and seems almost certainly true._
|
| This does not really change the argument: even if, for
| example, the fine-structure constant is not constant after
| all, there will most likely be a handful of other
| constants that describe how it varies over space and time.
| This is very different from every electron having a unique
| electric charge that is not governed by anything, where the
| only way to figure it out is to measure it for each
| electron.
|
| It would also probably make a difference how the values are
| distributed: are the charges of the electrons nicely
| distributed, varying by a factor of two or ten? Some
| statistical theory could probably deal with that. But what
| if there is no nice distribution, if values vary by
| hundreds of orders of magnitude, if the expectation value
| or the variance of the charge is infinite? I am certainly
| unqualified to make any definitive statements, but I can
| imagine physics being weird to the point that it becomes
| mathematically intractable, or at least only produces
| useless answers because of the amount of uncertainty that
| enters the equations.
|
| In any case, we know that the universe is simple to a very
| good approximation: even if fundamentally everything
| depends on everything and the electron charges are all
| random, those effects are small, and we can get a good
| approximation with simple theories.
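|
| A toy way to see the size of that gap (my own illustration, not
| from the paper): count the couplings you would need if every
| pair of particles interacted with its own arbitrary strength,
| versus nearest neighbours only with one shared strength (like
| every electron carrying the same charge).
|
|     n = 1_000_000                    # "particles"
|
|     # Worst case: every pair gets its own arbitrary coupling.
|     dense_params = n * (n - 1) // 2  # ~5e11 numbers to measure
|
|     # Physics-like case: local interactions with one shared
|     # coupling constant.
|     local_shared_params = 1
|
|     print(dense_params, local_shared_params)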
| oivey wrote:
| Speaking as someone with a physics degree: physicists really
| have a unique ability to stroke their own egos. I think I agree
| with the premise - structure is certainly the most important
| thing in both physics and machine learning. However, a
| significant portion of the effectiveness of ML is letting the
| computer find the structure for itself rather than doing it
| yourself. The most effective uses guide the learning via minimal
| structure.
| Quekid5 wrote:
| PSA: Anyone can publish just about anything on arxiv.org. Doesn't
| mean that it has any merit whatsoever.
| whatshisface wrote:
| Not anyone; you have to be vouched for by someone who can
| publish there. Look at viXra for an example of a preprint server
| that truly anyone can submit to.
|
| https://vixra.org/
___________________________________________________________________
(page generated 2021-06-22 23:01 UTC)