[HN Gopher] Spiking Neural Networks
___________________________________________________________________
Spiking Neural Networks
Author : orbifold
Score : 39 points
Date : 2021-12-13 20:31 UTC (2 hours ago)
(HTM) web link (simons.berkeley.edu)
(TXT) w3m dump (simons.berkeley.edu)
| periheli0n wrote:
| Neuromorphic hardware (NMHW) has been usable for 10 years now.
| Since then, algorithms for neuromorphic hardware (i.e., spiking
| networks) have consistently performed 'almost as well as' ANN
| solutions on GPUs (meaning: worse). Meanwhile, each year a new
| generation of GPUs comes out, built on modern processes and
| shipping with excellent toolchains. In a direct comparison of
| power efficiency, GPUs beat NMHW most of the time.
|
| I would love to see spiking networks and NMHW take over machine
| learning, but they have such a long way to go. And I seriously
| doubt the strategy, followed by most players, of trying to beat
| good old ANNs at their own game.
|
| Unless we identify a problem set where event-based computing with
| spikes is the inherently natural solution, I find it hard to
| imagine that spiking networks will ever outcompete ANN solutions.
| a-dub wrote:
| > Unless we identify a problem set where event-based computing
| with spikes is the inherently natural solution, I find it hard
| to imagine that spiking networks will ever outcompete ANN
| solutions.
|
| i'd guess that domain would be real-time (unbuffered /
| unbatched) processing of raw sensory data. it seems reasonable
| that biological neural systems evolved for optimal processing
| of sensory information encoded temporally in spike trains, yet
| the few papers on neuromorphic computing i've seen tend to try
| to hammer spiking neural networks into a classic batch-based
| machine learning paradigm and then score them against
| batch-based anns.
| periheli0n wrote:
| This, very much so.
|
| On the other hand, even biology often uses rate codes, which
| are inefficient and limited in what they can represent
| compared to the timing-sensitive codes: latency codes,
| rank-order codes, phase codes, pattern codes, population
| codes, etc.
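|
| To make the contrast concrete, here is a minimal sketch (in
| Python, with illustrative numbers only) of the same value
| encoded as a rate code versus a latency code:
|
|     import numpy as np
|
|     rng = np.random.default_rng(0)
|
|     def rate_encode(x, window=0.1, max_rate=100.0):
|         # Rate code: the value sets a firing rate, so many
|         # spikes (a Poisson train here) carry a single value.
|         n = rng.poisson(x * max_rate * window)
|         return np.sort(rng.uniform(0.0, window, n))
|
|     def latency_encode(x, window=0.1):
|         # Latency code: the value is carried by the timing of
|         # a single spike; larger values fire earlier.
|         return np.array([(1.0 - x) * window])
|
|     x = 0.8                   # normalised stimulus in [0, 1]
|     print(rate_encode(x))     # ~8 spike times in a 100 ms window
|     print(latency_encode(x))  # one spike, at 20 ms
|
| The rate code spends a whole window of spikes on one value,
| while the latency code conveys it with a single spike time.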
|
| And when we look at the technical domain, event-based vision
| cameras spew out what could pass as spikes; but even in that
| area, spiking networks have proven too limited compared to
| event-based algorithms that were only vaguely bio-inspired.
| And the event-camera technology itself took about 20 years
| from conception to its breakthrough on the market.
|
| So the question is whether spiking networks are indeed the
| future of computation. But without doubt, the concept is very
| interesting academically. A bit like Haskell :D
| a-dub wrote:
| > And when we look at the technical domain, event-based
| vision cameras spew out what could pass as spikes
|
| i have not seen these, i'm curious. do they try to mimic
| early stages of the human visual system? (i.e., a
| mechanical v1, with outputs that actually look like the
| spatial and frequency tuning that is often found in v1
| neurons?)
|
| edit: <3 wikipedia:
| https://en.wikipedia.org/wiki/Event_camera
|
| > So the question is whether spiking networks are indeed
| the future of computation. But without doubt, the concept
| is very interesting academically. A bit like Haskell :D
|
| or if it will be something we hand code at all... i suspect
| that the future of computation will be derived by the
| machines themselves. if one can use GANs to generate entire
| novel cryptosystems (i read a while back that google was
| doing this), it seems only natural that they could be used
| for finding optimal computational paradigms.
|
| although many would argue that optimal computation is
| computation that is best understood by humans.
| periheli0n wrote:
| > i have not seen these, i'm curious.
|
| The original incarnation goes by the name of Dynamic
| Vision Sensor (DVS), marketed by iniVation, an ETH
| Zurich spinoff. Prophesee is another manufacturer with
| their own IP. I think Sony makes event-based cameras
| too; perhaps others as well (Samsung? or was it
| Huawei?).
|
| They mimic the retina, each pixel emitting events
| ('spikes' if you wish) when the change in luminance
| crosses a threshold. There is no frame clock; each
| pixel works asynchronously. The technology is known
| for extremely low latency, high temporal resolution,
| and ultra-high dynamic range. Have a look ;)
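|
| In code, the per-pixel behaviour is roughly the
| following (a toy Python sketch of the principle, not
| any vendor's actual pipeline; real sensors do this in
| analog, per pixel, with no loop):
|
|     import math
|
|     def dvs_pixel(samples, threshold=0.2):
|         # Toy event-camera pixel: emit an ON/OFF event
|         # whenever log-luminance has moved by more than
|         # `threshold` since the last event. There is no
|         # frame clock: output happens only on change.
|         events = []
|         _t0, lum0 = samples[0]
|         ref = math.log(lum0)
|         for t, lum in samples[1:]:
|             delta = math.log(lum) - ref
|             if abs(delta) >= threshold:
|                 # (timestamp, polarity)
|                 events.append((t, 1 if delta > 0 else -1))
|                 ref = math.log(lum)
|         return events
|
|     # Constant illumination produces no output at all;
|     # the brightness step at t=3 gives an ON event, the
|     # drop at t=5 an OFF event.
|     trace = [(0, 1.0), (1, 1.0), (2, 1.0),
|              (3, 2.0), (4, 2.0), (5, 0.5)]
|     print(dvs_pixel(trace))   # [(3, 1), (5, -1)]
|
| Static scenes cost essentially nothing, which is where
| the low latency and low data rate come from.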
| JackFr wrote:
| Makes me think that we can send a man to the Moon but we can't
| come up with a practical aircraft powered by flapping wings.
|
| Evolved biological processes can be very subtle. They're worth
| studying for their own sake, but they are not necessarily the
| best way to solve general problems.
| cblconfederate wrote:
| I don't get the appeal of spiking networks. They struggle to
| solve problems that ANNs already solve readily, and they don't
| offer much in terms of biological realism: they don't account
| for neuronal geometry or dendritic nonlinear phenomena, nor do
| they explain the protein dependence of LTP.
| periheli0n wrote:
| These are two disparate applications of spiking networks. 1.
| Machine learning: yes, many in the field seem to be trying to
| reinvent ANNs with spikes. Not very useful in my opinion. 2.
| Modeling biological processes: a lot of progress has been made
| in neuroscience research thanks to spiking network models,
| coupled with dendritic computation and all sorts of other
| biological detail. But one would not normally use neuromorphic
| hardware if the end goal is biological realism.
|
| Only if biological realism is required in real time and on a
| constrained power budget, such as on a robot, is neuromorphic
| hardware the weapon of choice.
| nynx wrote:
| Source: I'm an undergrad doing research in neuromorphic
| computing.
|
| A lot of writing about SNNs misses recent findings on the
| effect of astrocytes on the dynamical state of the network. If
| you look at the recent SNN literature, you'll find that
| including astrocyte models in the networks significantly
| improves memory and accuracy in many cases.
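|
| The core idea, very roughly, is a third and much slower
| variable sitting next to the synapse. Here is a toy sketch
| (Python, made-up parameters, not any specific published
| astrocyte model):
|
|     import numpy as np
|
|     # A cartoon "tripartite synapse": a leaky integrate-and-fire
|     # neuron whose input efficacy is slowly modulated by an
|     # astrocyte trace driven by the same presynaptic spikes.
|     # All numbers are illustrative only.
|     def run(astro_gain, seed=1, dt=1e-3, T=2.0):
|         tau_v, tau_a = 20e-3, 500e-3   # membrane / astrocyte taus
|         v_th, w0 = 1.0, 0.3            # threshold, base weight
|         rng = np.random.default_rng(seed)
|         pre = rng.random(int(T / dt)) < 100 * dt  # ~100 Hz input
|         v = a = 0.0
|         n_out = 0
|         for spk in pre:
|             # slow astrocyte trace, driven by presynaptic spikes
|             a += dt * (-a / tau_a) + (0.1 if spk else 0.0)
|             # "gliotransmission" scales the synaptic efficacy
|             w = w0 * (1.0 + astro_gain * a)
|             v += dt * (-v / tau_v) + (w if spk else 0.0)
|             if v >= v_th:              # fire and reset
|                 n_out += 1
|                 v = 0.0
|         return n_out
|
|     print("without astrocyte:", run(astro_gain=0.0))
|     print("with astrocyte:   ", run(astro_gain=0.5))
|
| The slow trace pushes the neuron into a different firing
| regime; the papers then measure what that shift does to
| memory and accuracy on their benchmarks.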
|
| It's too bad this article doesn't mention that.
| klowrey wrote:
| Have any links you would recommend?
| orbifold wrote:
| One of my first memorable PhD experiences was getting invited
| to an impromptu meeting in Lithuania, where we met a Finnish
| and a Lithuanian scientist, both working on astrocyte models,
| and wondered how that work could be applied to neuromorphic
| hardware. Needless to say, this didn't go anywhere, but at
| least we got to sample some really nice Lithuanian food.
| periheli0n wrote:
| Spiking networks are to machine learning what Haskell is to C++
| and Python: while not really used to solve many real-world
| problems, they are extremely interesting academically, and
| important concepts have ended up in the mainstream, like
| event-based sensing and control, or event-driven signal
| processing.
___________________________________________________________________
(page generated 2021-12-13 23:00 UTC)