[HN Gopher] Topological Deep Learning: A Survey on Topological N...
___________________________________________________________________
Topological Deep Learning: A Survey on Topological Neural Networks
Author : dil8
Score : 100 points
Date   : 2023-04-23 22:46 UTC (1 day ago)
(HTM) web link (arxiv.org)
(TXT) w3m dump (arxiv.org)
| liorben-david wrote:
| The paper seems to be using homology as the topological feature
| here. I've done some work in Topological Data Analysis before and
| it feels like the hidden issue is that computing homology is
| generally very inefficient (since it usually amounts to
| reducing an n x n matrix).
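| For concreteness, here is a minimal sketch of that column
| reduction over Z/2 - not from the paper, and the names are mine -
| just to show where the cubic worst case comes from:
|
|     import numpy as np
|
|     def reduce_boundary(D):
|         """D: binary boundary matrix, rows/cols in filtration order."""
|         R = D.copy() % 2
|         low = {}                     # lowest nonzero row -> owning column
|         for j in range(R.shape[1]):
|             while R[:, j].any():
|                 i = np.max(np.nonzero(R[:, j])[0])   # lowest 1 in column j
|                 if i not in low:
|                     low[i] = j       # column j pairs with row i
|                     break
|                 # cancel against the earlier column with the same low
|                 R[:, j] = (R[:, j] + R[:, low[i]]) % 2
|         return low                   # (i, j) pairs give birth/death pairs
|
| Each column can require adding most of the earlier columns, each
| addition costing O(n), hence roughly O(n^3) in the worst case.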
|
| It definitely feels like graphs/topology should be helpful
| tools for working with data (since graph-like structures are
| good representations of the real world), but we need to solve
| this efficiency issue before that can happen.
|
| Also, to address the confusion about how category theory comes
| into it: category theory studies abstract structures where you
| have objects and relationships between these objects. A lot of
| algebraic topology (which is the sort of topology relevant
| here) is built in the language of category theory (either by
| necessity or by convention).
| chaxor wrote:
| I think most people in the field recognize this efficiency
| issue and focus on very small NNs. So far I have seen very
| little promising work on improving efficiency, other than work
| outside of NNs, e.g. Eirene for PH (which is actually a mess
| under the hood - someone needs to clean up the code base; last
| time I checked it had nine-deep nested for loops, really?).
| There are several options for topological NN layers out there
| now, but all of them are excruciatingly slow and blow up RAM
| very quickly when I have tried them out (CPU-run NNs, since
| fairly small problems typically need >500 GB).
| liorben-david wrote:
| Completely agree. Part of the issue is that the people
| working on this are far more on the math side than on the
| CS side. For example, Gunnar Carlsson's textbook and papers
| about persistent homology are geared towards algebraic
| topologists, when there is probably a way to explain
| persistent homology that does not rely on category theory,
| polynomial rings, etc.
| lmcinnes wrote:
| For suitably specialized cases things can be quite efficient.
| For persistent H_0 of VR-complexes in low-dimensional space
| there is an O(N log(N)) algorithm for N data points; that's
| decently fast. If you want H_1, I believe (but cannot prove)
| that there exists an O(N^2 log(N)) algorithm. Beyond H_1
| things get painful. Since most PH software is written for the
| general case, it doesn't tend to avail itself of these
| special-case shortcuts. Given that H_0 and H_1 of VR-
| complexes of low-dimensional data cover a vast share of the
| use cases, I think specialized code for this would be
| worthwhile.
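| Roughly, the H_0 special case is just single linkage / a
| minimum spanning tree. A naive sketch (mine, not from any
| package; this version sorts all pairs, so it is O(N^2 log N)
| rather than the O(N log(N)) figure above, which needs a
| smarter MST construction):
|
|     import numpy as np
|     from scipy.spatial.distance import pdist, squareform
|
|     def h0_persistence(points):
|         d = squareform(pdist(points))
|         n = len(points)
|         parent = list(range(n))
|         def find(x):                       # union-find with path halving
|             while parent[x] != x:
|                 parent[x] = parent[parent[x]]
|                 x = parent[x]
|             return x
|         deaths = []
|         edges = sorted((d[i, j], i, j)
|                        for i in range(n) for j in range(i + 1, n))
|         for w, i, j in edges:              # Kruskal: MST edge weights
|             ri, rj = find(i), find(j)
|             if ri != rj:
|                 parent[ri] = rj
|                 deaths.append(w)           # a component merges (dies) at w
|         return deaths                      # all H_0 classes are born at 0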
| dpflan wrote:
| I am not well versed in topological deep learning, but can
| topological concepts be learned by a graph neural network --
| i.e., can there be topological layer(s) that learn the
| topology of the graphs, much as it's easy to visualize how a
| CNN architecture sees different hierarchical patterns at
| deeper layers, seemingly building up an understanding of the
| greater complexities of an image representation?
| chaxor wrote:
| Yeah, this is being done by several groups across the field.
| Check out the work by Gunnar Carlsson and Bastian Rieck. If
| those are enjoyable, also check out Michael Bronstein and Petar
| Velolickovic, although not directly related to your question.
| mplemay wrote:
| Do you mind linking the homepage, google scholar, etc. of
| "Petar Velolickovic"? The only reference to the name on
| google search appears to be this post.
| dpflan wrote:
| - https://geometricdeeplearning.com/ -- group of
| researchers presenting the paradigm
|
| - https://petar-v.com/ - Petar's homepage
|
| - https://scholar.google.co.uk/citations?user=kcTK_FAAAAAJ&
| hl=... - Petar's Google Scholar
|
| - https://www.linkedin.com/in/petarvelickovic/ - Petar's LI
| dpflan wrote:
| GAT is particularly interesting at the moment: https://sc
| holar.google.co.uk/citations?view_op=view_citation...
| zwaps wrote:
| The proposed use cases (folding, social networks) correspond to
| what was previously termed "Geometric Deep Learning". Here,
| things like Graph Neural Networks were understood in terms of
| (pairwise?) relations to be learned under the assumption of
| symmetry/equivariance. It can be shown that not all relations
| that fit in a graph can be learned by such GNNs. Further, not
| all relations can be modeled with a (symmetric) graph.
|
| Hence, these people moved on to using Category Theory, which
| may or may not lead to the use of GNNs.
|
| Reading the first part of the present paper, Topological DL
| would instead move beyond the idea of "pairwise relations".
| dpflan wrote:
| Thank you. "Reading the first part of the present paper,
| Topological DL would instead move beyond the idea of
| 'pairwise relations'" -- interesting. It seems to make sense
| intuitively, given the difficulties with GNNs, to look at
| higher-order representations -- which may lead to
| understanding the underlying graph properties too. But I had
| assumed that such higher orders of the graph would be part
| of the GNN learning process, even representations that are
| not really "humanly mathematically understood" yet.
|
| That also interests me regarding Geometric Deep Learning,
| which got some hype and interest recently, and seemed like a
| good start for a more formal representation of different
| deep learning models (finding connections and mathematical
| steps between the model zoo). Something more mathematically
| rigorous does seem needed to truly make informed engineering
| improvements and gain scientific understanding.
| zwaps wrote:
| I agree that it's exciting to move toward a mathematical
| approach to characterizing what can, and what cannot, be
| learned by a DNN!
|
| Maybe someone else has the knowledge to give you some more
| precise answers.
|
| As far as I understand, we need to differentiate between
| higher order relations that can be learned by message
| passing and those that cannot. Some higher order
| relationships can be represented in a graph (of pairwise
| relations) and, furthermore, can be learned by iterating
| out from a focal node through the paths (think centrality).
| GNNs can learn those.
|
| But GNNs can't learn all graphs, because not all such
| relations are amenable to message passing. So, not all
| higher order information contained in a graph is learned
| by GNNs.
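| To make "message passing" concrete, one step looks roughly
| like this (toy sketch, names are mine):
|
|     import numpy as np
|
|     def mp_step(A, H, W_self, W_nbr):
|         """A: (n, n) adjacency, H: (n, d) node features."""
|         msgs = A @ H                  # each node sums its neighbours
|         return np.tanh(H @ W_self + msgs @ W_nbr)
|
| Stacking k such steps lets a node see its k-hop
| neighbourhood, which is why path/centrality-style
| information is learnable, and why relations that don't
| decompose into such pairwise hops are not.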
|
| In the present case, the authors seem to make the point
| that there are higher-order relations (say, node-set) which
| are also not representable by a graph (or a collection of
| graphs).
|
| I actually do not know whether that is obviously true. For
| instance, hypergraphs can be modeled as a collection of
| graphs. But if the claim stands, then we'd need topological
| deep learning to move further. Otherwise, like you say, it
| is maybe a more general framework for understanding
| information processing in DL - or a more efficient way to
| learn node-node or node-set information.
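| For what it's worth, the naive way of collapsing a
| hypergraph into a single graph (clique expansion) does lose
| information, which is part of why the question isn't
| trivial. A tiny sketch (illustrative only, not from the
| paper):
|
|     from itertools import combinations
|
|     def clique_expand(hyperedges):
|         edges = set()
|         for he in hyperedges:         # each hyperedge -> pairwise clique
|             edges.update(combinations(sorted(he), 2))
|         return edges
|
|     h1 = [{0, 1, 2}]                  # one genuine 3-way interaction
|     h2 = [{0, 1}, {1, 2}, {0, 2}]     # three pairwise interactions
|     assert clique_expand(h1) == clique_expand(h2)   # indistinguishable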
|
| Either way, I have not yet quite figured out how Category
| Theory and Topological DL relate to each other (each
| generalising Geometric Deep Learning).
| lmeyerov wrote:
| Agreed, the definitions here of what a GNN is (=> cannot
| do) seem pedantically narrow & extreme, and then throw
| the baby out with the bath water.
|
| See the last ~2 years of work by Michael Bronstein's
| teams (he was, I believe, Twitter's GNN R&D lead) that
| address 2+ of the points here.
|
| It's true they are not the end-all, but our issues in
| practice aren't the above; they're more like how best to
| combine temporal deep learning techniques with graph
| ones (it is not just another continuous dimension).
|
| Worth noting: TDA methods & UMAP are largely the same
| under the hood & in practice, and we find UMAP (incl.
| neural) _highly_ effective, and typically reach for it
| before GNNs. UMAP has been an admirably rare mix of
| theory & practicality.
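| If anyone wants to try it, typical minimal UMAP usage
| looks like this; the parameters and random data are just
| illustrative:
|
|     import numpy as np
|     import umap                        # pip install umap-learn
|
|     X = np.random.rand(1000, 50)       # stand-in for real features
|     emb = umap.UMAP(n_neighbors=15, min_dist=0.1).fit_transform(X)
|     print(emb.shape)                   # (1000, 2)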
| dpflan wrote:
| Interesting. Could one graph be the topological
| representation, let's say like SCCs? So there would be
| heterogeneous graphs with meta-graphs connected to
| them...?
___________________________________________________________________
(page generated 2023-04-24 23:01 UTC)