[HN Gopher] Hopfield Networks Is All You Need
       ___________________________________________________________________
        
       Hopfield Networks Is All You Need
        
       Author : meiji163
       Score  : 109 points
       Date   : 2021-04-30 05:10 UTC (1 day ago)
        
 (HTM) web link (ml-jku.github.io)
 (TXT) w3m dump (ml-jku.github.io)
        
       | komalghori22 wrote:
       | Amazing
        
       | aparsons wrote:
       | I've seen a lot of efforts to add a notion of associative memory
       | into neural networks. Have any exciting applications of such
       | architectures been publicised?
        
         | SneakyTornado29 wrote:
         | https://arxiv.org/search/cs?searchtype=author&query=Hochreit...
        
         | orange3xchicken wrote:
         | Relevant paper from Misha Belkin's group
         | https://arxiv.org/abs/1909.12362
        
         | truth_ wrote:
          | Just a few days ago researchers from Peking U and Microsoft
         | published a paper[0] saying they can access "knowledge neurons"
         | in pretrained embeddings that will enable "fact editing"[1].
         | 
         | [0]: https://arxiv.org/pdf/2104.08696.pdf
         | 
         | [1]: https://medium.com/syncedreview/microsoft-peking-u-
         | researche...
        
         | ilaksh wrote:
         | I thought that Transformers were a type of associative memory.
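That intuition matches the headline result of the linked post: the update rule of a modern (continuous) Hopfield network, xi <- X softmax(beta X^T xi), is exactly transformer softmax attention, with the stored patterns acting as both keys and values. A minimal numpy sketch of one retrieval step (illustrative code, not taken from the paper or the ml-jku repo; the dimensions and beta value are arbitrary choices):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Stored patterns are the columns of X (d = 16 dimensions, N = 5 patterns).
rng = np.random.default_rng(0)
X = rng.standard_normal((16, 5))

# Modern Hopfield update: xi_new = X @ softmax(beta * X.T @ xi).
# This is one step of softmax attention where the query is xi and the
# stored patterns serve as both keys and values. With a large inverse
# temperature beta, a single update snaps a noisy query onto the
# closest stored pattern.
beta = 8.0
xi = X[:, 2] + 0.1 * rng.standard_normal(16)   # noisy version of pattern 2
xi_new = X @ softmax(beta * (X.T @ xi))
```

Adding learned projections for queries, keys, and values on top of this update is what turns it into a transformer-style attention layer; that equivalence is what the linked post and the hopfield-layers code build on.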
        
       | ArtWomb wrote:
        | Trending as John Hopfield is scheduled to present his
        | "biologically plausible" response to the Modern Hopfield Network
        | at ICLR next week:
       | 
       | Large Associative Memory Problem in Neurobiology and Machine
       | Learning
       | 
       | https://arxiv.org/abs/2008.06996
       | 
        | MHNs seem ideal for prediction problems based purely on data, such
       | as chemical reactions and drug discovery:
       | 
       | Modern Hopfield Networks for Few- and Zero-Shot Reaction
       | Prediction
       | 
       | https://arxiv.org/abs/2104.03279
        
         | SpaceManNabs wrote:
          | Krotov (Hopfield's co-author on this set of papers) has a
          | tweet tutorial for that paper in your first link
         | 
         | https://twitter.com/DimaKrotov/status/1387770672542269449
        
       | [deleted]
        
       | [deleted]
        
       | SneakyTornado29 wrote:
       | Are*
        
       | [deleted]
        
       | einpoklum wrote:
       | Brief abstract for the lay person (like me):
       | 
        | 1. Hopfield Networks, also known as "associative memory
        | networks", are a neural network model developed decades ago by a
        | guy named Hopfield.
        | 
        | 2. It's useful to plug these in somehow as layers in Deep Neural
        | Networks today (particularly in PyTorch).
       | 
       | I hate non-informative titles!
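To make point 1 concrete, here is a minimal toy sketch of the classic binary Hopfield network (illustrative code, not from the paper): patterns are stored with the Hebbian outer-product rule, and a corrupted input is completed by repeatedly applying a sign update.

```python
import numpy as np

def train(patterns):
    """Hebbian storage: W = (1/d) * sum_p x_p x_p^T, with zero diagonal."""
    d = patterns.shape[1]
    W = patterns.T @ patterns / d
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=10):
    """Retrieve by iterating the update x <- sign(W x)."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1  # break ties toward +1
    return x

pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]          # corrupt one bit
restored = recall(W, noisy)   # the network completes the stored pattern
```

The "modern" Hopfield networks of the linked post replace this sign update with a softmax over continuous patterns, which raises the storage capacity from roughly 0.14·d binary patterns to exponentially many.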
        
         | isoprophlex wrote:
         | Also... While cute, I found the examples of storing and
         | retrieving images of The Simpsons characters not very
         | informative about what goes on in that weight matrix that
         | stores patterns.
         | 
          | Edit: the linked pytorch implementation looks interesting;
          | these layer types promise pretty incredible things
         | https://github.com/ml-jku/hopfield-layers
        
         | virgil_disgr4ce wrote:
         | Not to mention grammatically incorrect ones :/
        
           | skrebbel wrote:
           | I think it's correct, in the same way that you can say
           | "Rolling Stones is a great band". It's about the tech called
           | "Hopfield Networks", not about any particular number of
           | networks that are all you need.
        
             | caddemon wrote:
             | Not doubting what is officially grammatically correct, but
             | that still sounds really weird to me. Like with sports
             | teams I would only ever say "The Patriots are a good team"
             | or "New England is a good team". Not "The Patriots is a
             | good team".
             | 
             | In any event, the authors definitely chose that title as a
              | callback to the well-known paper "Attention is all you
             | need", which introduced Transformers. So that probably
             | influenced their decision to use "is" instead of "are".
        
               | robotresearcher wrote:
               | Consider 'My team is a good team' vs. 'My team are a good
               | team'.
               | 
               | I bet 'is' sounds better to you in this context, though
               | 'my team' and 'The Patriots' are similar noun phrases
               | that could refer to exactly the same thing.
               | 
                | The difference is that "Patriots" is plural. Replace it
               | with Manchester United and 'is' sounds good again.
        
               | caddemon wrote:
               | Yeah it's definitely caused by the team name being
               | plural, or at least sounding plural - I've never heard
               | anyone say "The Red Sox is good" either. Regardless of
               | what is technically grammatically correct I think real
               | life usage has pretty much settled on that convention, at
               | least in the US.
        
               | drdeca wrote:
                | Something odd: While "The Red Sox are John's favorite
                | team." seems more natural than "The Red Sox is John's
                | favorite team.", phrasing it in the opposite order,
                | "John's favorite team is the Red Sox." seems more natural
                | than "John's favorite team are the Red Sox." .
                | 
                | This seems like a strange discrepancy. Why is this the
                | case? Maybe it is because "favorite team" is clearly
                | singular, and is closer in the sentence to the "is"/"are"
                | than the plural-indicating sound in "Red Sox". Or maybe
                | it is just whichever comes first that determines how the
                | "to be" is conjugated?
               | 
               | Hm, but what if instead of connecting a noun phrase
               | (determiner phrase?) like "The Red Sox" to another noun
               | phrase (determiner phrase) "John's favorite team", we
               | instead connect it to an adjective?
               | 
               | "The Red Sox are singular.", "The Red Sox is singular.",
               | "Singular is The Red Sox." "Singular are The Red Sox." .
               | Well, the "[Adjective] is [noun]" is kind of an unusual
               | thing to say unless one is trying to sound like one is
               | quoting poetry or yoda or something, but to the degree
                | that either of them sounds ok, I think "Singular are The
               | Red Sox." sounds better than "Singular is The Red Sox." .
               | Though, in this case, there doesn't seem to be anything
               | grammatically suggested by the adjective that the thing
               | be in the singular case (maybe I shouldn't have used
               | "singular" as the adjective..) .
               | 
               | Hm, what if instead of "John's favorite team [is/are] the
               | Red Sox." , we instead look at "John's favorite [is/are]
               | the Red Sox." ? In this case, it seems, less clear which
               | is more natural? They seem about the same to me (but that
               | might just be me, idk.) .
               | 
               | Anyway : Weird!
        
         | SneakyTornado29 wrote:
         | The title is a reference to the famous machine learning paper
         | "Attention Is All You Need" which introduced the concept of
         | transformers. Transformers have revolutionized how we process
          | sequential data (e.g. natural language processing).
        
           | bonoboTP wrote:
           | Which itself is a reference to the 1967 Beatles song _All You
           | Need is Love_ (which also includes the line  "Love is all you
           | need").
        
       | tediousdemise wrote:
       | Off-topic, but does anyone know what Jekyll theme this is?
       | Absolutely beautiful formatting and color scheme.
        
         | phab wrote:
         | https://pages-themes.github.io/cayman/
        
         | ansk wrote:
         | Further off-topic, but do people actually consider this to be
         | beautiful design? Looks like a rendered markdown document with
         | MathJax and green headers. Perfectly appropriate for the
         | content of the post, but beautiful isn't the first word that
         | comes to mind for me.
        
           | tediousdemise wrote:
           | Beauty is in the eye of the beholder, isn't it? I like the
           | font, as well as the greens, blues, and header gradient.
           | Green is my favorite color.
           | 
           | I also like dark themes (although I wouldn't force those on
           | my viewership).
        
           | mhh__ wrote:
           | I don't think it's awful but I don't like it.
           | 
           | I really wish I could literally just dump LaTeX onto the web
            | and be done with it. Everything I've tried either doesn't
            | work properly / isn't 1:1 (Pandoc is cute), or _does_ work
            | but yields enormous amounts of HTML (pdf2htmlex).
           | 
           | I am fairly happy with [insert MD->Book tool of your choice],
           | but sometimes I want citations and things like that.
        
           | dmix wrote:
           | No I very much dislike it.
        
       | kdavis wrote:
        | "Sooner or later, everything old is new again." -Stephen King
        
         | mhh__ wrote:
         | "I'm fashionable once every 15 years, for about three months" -
         | John Cooper Clarke
        
       ___________________________________________________________________
       (page generated 2021-05-01 23:00 UTC)