[HN Gopher] Beyond message passing: A physics-inspired paradigm ...
___________________________________________________________________
Beyond message passing: A physics-inspired paradigm for graph
neural networks
Author : andreyk
Score : 57 points
Date : 2022-05-09 18:04 UTC (4 hours ago)
(HTM) web link (thegradient.pub)
(TXT) w3m dump (thegradient.pub)
| phonebucket wrote:
  | Can anyone recommend any arXiv/paper links on the subject for
  | someone with reasonable prerequisites (e.g. neural ODEs, physics-
  | informed neural networks, and message passing)? The number of
  | references in the article is a bit of an overload! Looks like a
  | fascinating field.
| ssivark wrote:
| From a quick glance, the blog post seems to be based on the
| following paper involving the same author:
| https://arxiv.org/abs/2106.10934
| andreyk wrote:
  | A survey paper is usually a good way to go, such as A
  | Comprehensive Survey on Graph Neural Networks
  | (https://arxiv.org/abs/1901.00596) or Graph Neural Networks: A
  | Review of Methods and Applications
  | (https://arxiv.org/abs/1812.08434)
| albertzeyer wrote:
| Can someone explain the downvotes here? If you think these
| are bad papers, maybe recommend some better ones? Or what is
| wrong with this post?
| hasmanean wrote:
| So when will we have message-passing processor architectures
| again?
| sandGorgon wrote:
  anyone running graph neural networks in production? what
  framework do you use?
| dil8 wrote:
| Check out dgl (https://github.com/dmlc/dgl). A lot of papers
| and algorithms are implemented in the examples section.
| melony wrote:
| Are implementations of belief propagation considered message
| passing GNNs?
| andreyk wrote:
| Pretty sure that's the case, yeah
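The reason belief propagation is usually counted as message passing is that both follow the same aggregate-then-update template: each node collects messages from its neighbors along edges, then updates its own state. A minimal sketch of that template in plain Python (an illustration only, not any library's API; the function names here are made up):

```python
def message_passing_step(features, edges, message, update):
    """One round of generic message passing on a directed edge list.

    features : dict node -> feature (here, a float)
    edges    : list of (src, dst) pairs
    message  : M(h_src) -> message value sent along an edge
    update   : U(h_dst, aggregated) -> new feature for the node
    """
    # Aggregate incoming messages at each node (sum aggregation).
    agg = {v: 0.0 for v in features}
    for src, dst in edges:
        agg[dst] += message(features[src])
    # Update every node from its old state and its aggregated messages.
    return {v: update(features[v], agg[v]) for v in features}

# Toy graph: a triangle, with messages flowing in both directions.
edges = [(0, 1), (1, 0), (1, 2), (2, 1), (0, 2), (2, 0)]
h = {0: 1.0, 1: 2.0, 2: 3.0}
h = message_passing_step(h, edges,
                         message=lambda x: x,
                         update=lambda hv, m: 0.5 * (hv + m))
# -> {0: 3.0, 1: 3.0, 2: 3.0}
```

Sum-product belief propagation fits this shape with edge-directed messages and products/sums of factor potentials in place of the toy `message` and `update` above; a learned GNN layer fits it with neural networks in those two slots.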
___________________________________________________________________
(page generated 2022-05-09 23:00 UTC)