[HN Gopher] Critical brain hypothesis: A physical theory for whe...
___________________________________________________________________
Critical brain hypothesis: A physical theory for when the brain
performs best
Author : blegh
Score : 63 points
Date : 2023-01-31 19:58 UTC (3 hours ago)
(HTM) web link (www.quantamagazine.org)
(TXT) w3m dump (www.quantamagazine.org)
| amelius wrote:
| "The critical brain hypothesis suggests that neural networks do
| their best work when connections are not too weak or too strong."
|
| Isn't this just about as obvious as the fact that traffic flows
| best when traffic lights are neither always red nor always green?
| 0xcafefood wrote:
| "The critical brain hypothesis suggests that neural networks do
| their best work when connections are not too weak or too
| strong." is actually a tautology, no?
|
| Without a crisp explanation for what "too weak" or "too strong"
| mean, this is just saying "Neural networks work best when
| connections couldn't be changed to make them work better."
| TchoBeer wrote:
| It's not a tautology, because it isn't clear that there is
| some threshold beyond which connections are too weak or too
| strong. I might think, for instance, that more connections are
| always better.
| 0xcafefood wrote:
| So it might not be possible to reach a state with
| connections that are "too strong", but logically, if you
| could, you'd have to define them by suboptimal performance.
| actually_a_dog wrote:
| Not really. Signals have a finite power level. If you open all
| the lanes all the time, you'll get a very attenuated signal
| throughout the entire network. If some connections are stronger
| than others, that's when you can actually see interesting
| behavior.
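| To put rough numbers on that intuition (a toy sketch only; the
| power budget and noise floor below are made up for illustration):
|
|     # Toy signal-to-noise calculation: a fixed "power budget" spread
|     # over more and more connections eventually drops below a noise
|     # floor, i.e. "opening all the lanes" attenuates everything.
|     BUDGET = 1.0        # total signal power available (arbitrary units)
|     NOISE_FLOOR = 0.01  # assumed per-connection noise level
|
|     for n_connections in (10, 100, 1000, 10000):
|         per_link = BUDGET / n_connections   # spread uniformly
|         snr = per_link / NOISE_FLOOR        # signal-to-noise ratio
|         print(f"{n_connections:>6} links: SNR = {snr:8.2f}")
|     # Past ~1000 uniformly driven links the per-link SNR drops below 1,
|     # i.e. the fully "open" network carries a noisy, washed-out signal;
|     # concentrating the budget on fewer links keeps it usable.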
| karmakurtisaani wrote:
| I had a similar issue with the article. Essentially the
| information content seems to boil down to "there is a state
| where the brain works the best". For experts there is probably
| a lot to learn from the technicalities of this research, but
| the article leaves a layman a bit cold.
| anonymousDan wrote:
| To me the fact that more information is transmitted with an
| intermediate number of connections than with a strongly
| connected network wasn't immediately obvious at first glance. I
| guess there is a link to entropy, i.e. how surprised can you be
| by the information received at one end of the network given its
| connectivity.
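| A toy way to see that (a sketch of the entropy intuition above,
| not anything from the article; all parameters are made up):
|
|     # A chain of layers in which each unit excites the next layer with
|     # probability p. The entropy of the far end's activity -- how
|     # "surprised" you can be by what arrives -- peaks at intermediate
|     # coupling, not at maximal connectivity.
|     import math, random
|     from collections import Counter
|
|     LAYERS, WIDTH, TRIALS = 8, 20, 2000
|
|     def output_entropy(p):
|         counts = Counter()
|         for _ in range(TRIALS):
|             active = WIDTH // 2              # half the input layer fires
|             for _ in range(LAYERS):
|                 # a downstream unit fires if at least one input gets through
|                 prob_fire = 1.0 - (1.0 - p) ** active
|                 active = sum(random.random() < prob_fire for _ in range(WIDTH))
|             counts[active] += 1
|         return -sum((c / TRIALS) * math.log2(c / TRIALS)
|                     for c in counts.values())
|
|     for p in (0.005, 0.05, 0.2, 0.8):
|         print(f"coupling p={p}: output entropy = {output_entropy(p):.2f} bits")
|     # Very weak coupling: the signal dies and the output is almost
|     # always silent (low entropy). Very strong coupling: it saturates
|     # (low entropy). In between, near the critical point, the output is
|     # most variable, i.e. carries the most surprise.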
| asplake wrote:
| That makes it sound like optimising. To my admittedly limited
| understanding, it's more like keeping things right on the edge.
| tgv wrote:
| If you look for something in a complex system, and you look hard
| enough, you're probably going to find it. The example of epilepsy
| might just be seeing certain behavior through the lens of the
| theory. Unfortunately, the article fails to give us any hard
| definition of criticality.
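| For what it's worth, the usual operational definition in this
| literature (not spelled out in the article) is the branching ratio:
| the average number of units activated at the next time step per
| currently active unit. Below 1, activity dies out (subcritical);
| above 1, it blows up (supercritical); at roughly 1 the network is
| called critical, which is where avalanche sizes follow a power law.
| A naive estimate from binned population activity looks like this
| (a sketch; real analyses have to correct for subsampling bias):
|
|     # Naive branching-ratio estimate from a time series of population
|     # activity counts A[t] (e.g. spikes per time bin across an array).
|     def branching_ratio(activity):
|         ratios = [nxt / cur
|                   for cur, nxt in zip(activity, activity[1:]) if cur > 0]
|         return sum(ratios) / len(ratios) if ratios else float("nan")
|
|     # Hypothetical traces (made-up numbers, purely for illustration)
|     print(branching_ratio([8, 4, 2, 1, 0, 0]))     # < 1  -> subcritical
|     print(branching_ratio([2, 4, 8, 16, 32]))      # 2.0  -> supercritical
|     print(branching_ratio([5, 6, 4, 5, 5, 6, 5]))  # ~1.0 -> near critical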
| mach1ne wrote:
| Amen. I think the openness of complex systems to so many
| different interpretations is the bane of any comprehensive
| model of neuroscience.
| quantum_mcts wrote:
| "The Principles of Deep Learning" paper
| https://arxiv.org/abs/2106.10165 has a rather rigorous (based on
| Quantum Field Theory (QFT) mathematical apparatus) analysis of
| modern deep learning with the similar insight. They suggest that
| the learning happens in the critical regimes. And use running
| couplings, renormalization group and other fancy OFT math to
| derive some insights in the DL field. Here's a HN thread, by the
| way, https://news.ycombinator.com/item?id=31051540.
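| The deep-learning version of "critical" there is roughly
| criticality at initialization: weight variances tuned so that
| signals neither vanish nor explode with depth. A minimal sketch of
| that effect (not the book's derivation; just the standard He-init
| argument for ReLU networks):
|
|     # Signal propagation through a deep ReLU stack for different weight
|     # variances Var(W) = gain / fan_in. The critical choice gain = 2
|     # (He init) keeps the activation scale roughly constant with depth;
|     # smaller gains make it vanish, larger ones make it explode.
|     import numpy as np
|
|     rng = np.random.default_rng(0)
|     WIDTH, DEPTH = 512, 50
|
|     for gain in (1.0, 2.0, 4.0):
|         x = rng.normal(size=WIDTH)
|         for _ in range(DEPTH):
|             W = rng.normal(scale=np.sqrt(gain / WIDTH), size=(WIDTH, WIDTH))
|             x = np.maximum(W @ x, 0.0)   # ReLU layer, no biases
|         rms = np.sqrt(np.mean(x ** 2))
|         print(f"gain {gain}: RMS activation after {DEPTH} layers = {rms:.3e}")
|     # gain 1.0 -> the signal decays toward zero; gain 4.0 -> it blows
|     # up; gain 2.0 (critical) -> the scale stays of order one.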
| DecayingOrganic wrote:
| Any idea how this would affect learning with spaced repetition
| software? Perhaps the practice of excessive recall with, say,
| Anki could be detrimental to learning in some respects, as it
| would make certain connections in a neural network unnaturally
| strong and cause saturation and overactivation in the last
| layer.
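| For the artificial-network half of that speculation, the
| saturation effect is easy to show (a sketch only; whether anything
| analogous happens with biological memory or spaced repetition is
| pure conjecture):
|
|     # Once a weight gets very large, a sigmoid unit's output pins near
|     # 1 and its gradient collapses, so further training barely moves it.
|     import math
|
|     def sigmoid(z):
|         return 1.0 / (1.0 + math.exp(-z))
|
|     x = 1.0  # a fixed input feature
|     for w in (0.5, 2.0, 8.0, 32.0):
|         out = sigmoid(w * x)
|         grad = out * (1.0 - out) * x   # d(out)/dw for a sigmoid unit
|         print(f"w={w:>5}: output={out:.6f}, gradient wrt w={grad:.2e}")
|     # As w grows, the output saturates near 1 and the gradient shrinks
|     # toward 0 -- the overly strong connection stops being trainable.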
| gardenfelder wrote:
| Stuart Kauffman explored this idea years ago with his NK Theory
| [1]
|
| [1] https://en.wikipedia.org/wiki/NK_model
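| For reference, the NK model itself is compact enough to sketch in
| a few lines (an illustration, assuming the adjacent-neighbour
| convention; random neighbourhoods are also common):
|
|     # NK fitness landscape: N binary loci; locus i's fitness
|     # contribution depends on its own state plus the states of K
|     # neighbours, with contributions drawn uniformly at random.
|     # K = 0 gives a smooth, single-peaked landscape; K = N-1 gives a
|     # maximally rugged one.
|     import itertools, random
|
|     def make_nk(n, k, seed=0):
|         rng = random.Random(seed)
|         table = [{bits: rng.random()
|                   for bits in itertools.product((0, 1), repeat=k + 1)}
|                  for _ in range(n)]
|         def fitness(genome):
|             return sum(table[i][tuple(genome[(i + j) % n]
|                                       for j in range(k + 1))]
|                        for i in range(n)) / n
|         return fitness
|
|     def count_local_optima(n, k):
|         f = make_nk(n, k)
|         optima = 0
|         for g in itertools.product((0, 1), repeat=n):
|             flips = (g[:i] + (1 - g[i],) + g[i + 1:] for i in range(n))
|             optima += all(f(g) >= f(nb) for nb in flips)
|         return optima
|
|     for k in (0, 2, 7):
|         print(f"N=8, K={k}: {count_local_optima(8, k)} local optima")
|     # The number of local optima (ruggedness) tends to grow with K;
|     # tuning K tunes the landscape between smooth and chaotic.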
| varjag wrote:
| Wonder if there's a connection with Ballmer Peak.
| vermilingua wrote:
| I doubt Ballmer Peak needs to reach as far down as neuroscience
| to find a basis: developers tend to overthink, and alcohol de-
| thinks.
| alexpotato wrote:
| The book Drunk spends a lot of time on this very point:
|
| From a long-term health perspective, alcohol is very bad or
| even toxic for humans. That being said, it's been used
| throughout history for short-term benefits ranging from
| creativity to facilitating social interaction. On the
| creativity side, the author explicitly describes how drinking
| alcohol can help adults reactivate childlike curiosity and
| thinking, the idea being that the adult mind has too many
| inhibitions and alcohol helps lower those in both internal
| and external scenarios.
|
| 0 - https://www.amazon.com/Drunk-Sipped-Danced-Stumbled-
| Civiliza...
| revskill wrote:
| To me, it's always after sleep.
___________________________________________________________________
(page generated 2023-01-31 23:00 UTC)