A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL. The file type is application/pdf.
G-Mixup: Graph Data Augmentation for Graph Classification [article] (arXiv pre-print, 2022)
This work develops mixup for graph data. Mixup has shown superiority in improving the generalization and robustness of neural networks by interpolating features and labels between two random samples. ...
Specifically, we first use graphs within the same class to estimate a graphon. ...
of graph neural networks. ...
arXiv:2202.07179v2
fatcat:e3ip5hkzsbghhg6cuzwafekow4
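The snippet above outlines the G-Mixup recipe: estimate a graphon per class, interpolate the two graphons, then sample synthetic graphs from the mixture. A minimal sketch of that pipeline, assuming a crude sorting-and-averaging step-function estimator (the helpers `estimate_graphon` and `sample_graph` are illustrative, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_graphon(adjs, k=10):
    """Crude step-function graphon estimate: sort each adjacency matrix
    by node degree, zero-pad to a common size, average, and down-sample
    to a k x k block matrix (a sorting-and-averaging style estimator)."""
    n = max(a.shape[0] for a in adjs)
    acc = np.zeros((n, n))
    for a in adjs:
        order = np.argsort(-a.sum(axis=0))          # sort nodes by degree
        a = a[np.ix_(order, order)]
        padded = np.zeros((n, n))
        padded[: a.shape[0], : a.shape[0]] = a
        acc += padded
    acc /= len(adjs)
    idx = np.linspace(0, n, k + 1).astype(int)
    return np.array([[acc[idx[i]:idx[i+1], idx[j]:idx[j+1]].mean()
                      for j in range(k)] for i in range(k)])

def sample_graph(W, n):
    """Sample an n-node simple graph from a step-function graphon W."""
    u = rng.random(n)
    k = W.shape[0]
    blocks = np.minimum((u * k).astype(int), k - 1)
    probs = W[np.ix_(blocks, blocks)]
    a = np.triu((rng.random((n, n)) < probs).astype(float), 1)
    return a + a.T

# Estimate one graphon per class, then mix them G-Mixup-style.
W0 = estimate_graphon([sample_graph(np.full((2, 2), 0.2), 30) for _ in range(20)])
W1 = estimate_graphon([sample_graph(np.full((2, 2), 0.8), 30) for _ in range(20)])
lam = 0.5
W_mix = lam * W0 + (1 - lam) * W1      # interpolated graphon (and label)
mixed_graph = sample_graph(W_mix, 30)  # synthetic training graph
```

Sampling from `W_mix` yields graphs whose structure sits between the two classes, which is what gives the augmented training set its regularizing effect.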
Implicit Graphon Neural Representation [article] (arXiv pre-print, 2023)
In this paper, we propose to directly model graphons using neural networks, obtaining Implicit Graphon Neural Representation (IGNR). ...
Graphons are general and powerful models for generating graphs of varying size. ...
ACKNOWLEDGEMENTS This work is partially supported by NSF under grants CCF-2112665 and CCF-2217058. ...
arXiv:2211.03329v3
fatcat:jtedfzylqzhxhgv5t66siwvxpa
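The abstract describes modeling a graphon directly with a neural network, i.e. learning a function W_θ : [0,1]² → [0,1]. A toy sketch of such an implicit representation, using a small untrained MLP with explicit symmetrization (this architecture is assumed for illustration and is not the IGNR model itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random (untrained) MLP parameters: theta parametrizes W_theta.
W1_, b1 = rng.normal(size=(2, 16)), np.zeros(16)
W2_, b2 = rng.normal(size=(16, 1)), np.zeros(1)

def w_theta(x, y):
    """Evaluate the neural graphon at (x, y) in [0,1]^2, symmetrized
    so that W(x, y) == W(y, x), with a sigmoid output in [0, 1]."""
    def f(a, b):
        h = np.tanh(np.stack([a, b], axis=-1) @ W1_ + b1)
        return 1.0 / (1.0 + np.exp(-(h @ W2_ + b2)))
    return 0.5 * (f(x, y) + f(y, x))

# The representation is resolution-free: the same parameters give an
# edge-probability matrix at any size n.
for n in (5, 50):
    grid = (np.arange(n) + 0.5) / n
    xx, yy = np.meshgrid(grid, grid)
    P = w_theta(xx, yy)[..., 0]
    assert np.allclose(P, P.T) and P.min() >= 0 and P.max() <= 1
```

Training would fit θ so that graphs sampled from W_θ match observed graphs; the point of the sketch is only that one network evaluates the graphon at arbitrary resolution.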
Graph Neural Tangent Kernel: Convergence on Large Graphs [article] (arXiv pre-print, 2023)
We investigate the training dynamics of large-graph GNNs using graph neural tangent kernels (GNTKs) and graphons. ...
Graph neural networks (GNNs) achieve remarkable performance in graph machine learning tasks but can be hard to train on large-graph data, where their learning dynamics are not well understood. ...
More recently, Xia et al. (2022) proposed implicit graphon neural representations, which use neural networks to estimate graphons. ...
arXiv:2301.10808v1
fatcat:yt7mhcs6ffalvfpsrwgsd5bewq
When to Pre-Train Graph Neural Networks? An Answer from Data Generation Perspective! [article] (arXiv pre-print, 2023)
In particular, W2PGNN first fits the pre-training data into graphon bases, each element of graphon basis (i.e., a graphon) identifies a fundamental transferable pattern shared by a collection of pre-training ...
Existing works made great efforts on the issue of what to pre-train and how to pre-train by designing a number of graph pre-training and fine-tuning strategies. ...
arXiv:2303.16458v2
fatcat:mkx4kwmrtbfjpbdwcy5rreofwa
Graphon Pooling in Graph Neural Networks [article] (arXiv pre-print, 2020)
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs. ...
To do so, we consider the graph layers in a GNN as elements of a sequence of graphs that converge to a graphon. ...
Figure 2: Graphon Neural Network with three layers.
Figure 3: Average training and validation losses for predicting the ratings given by user 1 over 10 different training-test splits. ...
arXiv:2003.01795v1
fatcat:e4ux63xs4nbwti2l3af7rsfucu
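Viewing the graph layers of a GNN as a sequence converging to a graphon suggests pooling by coarsening the induced step-function graphon. A minimal block-averaging sketch of that idea (the equal-width partition of [0,1]² used here is an assumption; the paper's construction differs in detail):

```python
import numpy as np

def graphon_pool(A, m):
    """Pool an n-node graph down to m nodes by averaging its induced
    step-function graphon over an m x m coarser partition of [0,1]^2.
    Each output entry is the mean of one adjacency block."""
    n = A.shape[0]
    idx = np.linspace(0, n, m + 1).astype(int)
    return np.array([[A[idx[i]:idx[i+1], idx[j]:idx[j+1]].mean()
                      for j in range(m)] for i in range(m)])

A = np.ones((8, 8)) - np.eye(8)   # complete graph on 8 nodes
P = graphon_pool(A, 2)            # 2 x 2 pooled (weighted) representation
```

The pooled matrix is a smaller weighted graph whose entries approximate the same graphon, which is what lets convolutions transfer across the layers.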
Fine-tuning Graph Neural Networks by Preserving Graph Generative Patterns [article] (arXiv pre-print, 2023)
Recently, the paradigm of pre-training and fine-tuning graph neural networks has been intensively studied and applied in a wide range of graph mining tasks. ...
By utilizing a linear combination of these graphon bases, we can efficiently approximate W. ...
Acknowledgments This work is supported by NSFC (No.62322606) and the Fundamental Research Funds for the Central Universities. ...
arXiv:2312.13583v1
fatcat:opyqcznipfdoph2sn2ve63t6my
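The idea of approximating a generative graphon W by a linear combination of graphon bases can be sketched as a least-squares fit on a shared discretization grid (the bases and the target below are made up for illustration; the papers' bases come from the pre-training data):

```python
import numpy as np

# Discretize everything on a common k x k grid over [0,1]^2.
k = 20
g = (np.arange(k) + 0.5) / k
xx, yy = np.meshgrid(g, g)

bases = [
    np.full((k, k), 0.3),   # constant (Erdos-Renyi-like) basis
    np.minimum(xx, yy),     # "min" graphon basis
    np.outer(g, g),         # rank-one graphon basis
]
W_target = 0.5 * bases[0] + 0.5 * bases[2]   # target lies in the span

# Fit mixture coefficients by least squares over the flattened grids.
B = np.stack([b.ravel() for b in bases], axis=1)        # (k*k, 3)
coef, *_ = np.linalg.lstsq(B, W_target.ravel(), rcond=None)
W_hat = (B @ coef).reshape(k, k)
err = np.abs(W_hat - W_target).max()
```

When the target graphon lies in the span of the bases, the fit is exact and the coefficients recover the mixture weights; otherwise `W_hat` is the best approximation in the discretized L2 sense.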
Training Graph Neural Networks on Growing Stochastic Graphs [article] (arXiv pre-print, 2022)
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data. ...
We propose to grow the size of the graph as we train, and we show that our proposed methodology -- learning by transference -- converges to a neighborhood of a first order stationary point on the graphon ...
Graph Neural Networks are layered architectures, composed of graph convolutions followed by point-wise non-linearities. ...
arXiv:2210.15567v1
fatcat:gtuj2bfg6bbd3onc45zemn7l7q
Graph Neural Networks: Architectures, Stability and Transferability [article] (arXiv pre-print, 2021)
It is also shown that if graphs converge to a limit object, a graphon, GNNs converge to a corresponding limit object, a graphon neural network. ...
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs. ...
Graphon neural networks The graphon neural network (WNN) is defined as the limit architecture of a GNN defined on the graphs of a convergent graph sequence. ...
arXiv:2008.01767v3
fatcat:mhxi2q2rbbd6norzvxifcucncq
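The convergence statement above (graphs converging to a graphon, GNNs converging to a WNN) can be illustrated at its simplest level: graphs sampled from a fixed graphon have edge densities approaching the graphon's integral over [0,1]². A small sketch, assuming the example graphon W(x, y) = xy:

```python
import numpy as np

rng = np.random.default_rng(2)

def W(x, y):
    """A fixed smooth graphon, chosen as an example: W(x, y) = x * y."""
    return x * y

def sample(n):
    """n-node W-random graph: latent u_i ~ U[0,1], edge prob W(u_i, u_j)."""
    u = rng.random(n)
    P = W(u[:, None], u[None, :])
    A = np.triu((rng.random((n, n)) < P).astype(float), 1)
    return A + A.T

# Edge density approaches the integral of W over [0,1]^2, which is 1/4
# for W(x, y) = xy: a simple instance of a graph sequence converging
# to its graphon limit.
for n in (50, 200, 800):
    A = sample(n)
    print(n, round(A.sum() / (n * (n - 1)), 3))
```

The same sampling construction underlies the transferability results: a GNN trained on one sampled graph behaves similarly on larger graphs drawn from the same graphon.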
Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [article] (arXiv pre-print, 2023)
We evaluate our approach with a set of numerical experiments performed on graph neural networks (GNNs) that rely on graphon pooling. ...
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of [0, 1]² in the graphon space. ...
Graph neural networks A graph neural network (GNN) is a stacked layered structure (see Fig. 1). ...
arXiv:2212.08171v2
fatcat:u5okjeeilbezrjdswm3sgkgmf4
Searching for Stage-wise Neural Graphs In the Limit [article] (arXiv pre-print, 2019)
The scaled stage-wise graphs outperform DenseNet and randomly wired Watts-Strogatz networks, indicating the benefits of graphon theory in NAS applications. ...
By utilizing properties of the graphon space and the associated cut-distance metric, we develop theoretically motivated techniques that search for and scale up small-capacity stage-wise graphs found on ...
The optimal graphon, denoted by the filled black dot, is estimated by taking the average of the states visited by SGD. ...
arXiv:1912.12860v1
fatcat:66rukyyhj5h3bopgrv7n3hiprq
Transferability Properties of Graph Neural Networks [article] (arXiv pre-print, 2023)
We use graph limits called graphons to define limit objects for graph filters and GNNs -- graphon filters and graphon neural networks (WNNs) -- which we interpret as generative models for graph filters ...
Graph neural networks (GNNs) are composed of layers consisting of graph convolutions and pointwise nonlinearities. ...
Graphon Neural Networks A graphon neural network (WNN) is a deep convolutional architecture consisting of layers where each layer implements a convolutional filterbank followed by a pointwise nonlinearity ...
arXiv:2112.04629v4
fatcat:wahw2jdhnrfcbhysjhn6qb3nci
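A WNN layer, as described above, applies a graphon filter (a polynomial in the integral operator T_W f = ∫ W(·, y) f(y) dy) followed by a pointwise nonlinearity. A discretized sketch of one such layer (the graphon, the input signal, and the filter taps are all assumed for illustration):

```python
import numpy as np

# Discretize the graphon and a graphon signal on a regular grid.
k = 100
g = (np.arange(k) + 0.5) / k
Wg = np.minimum(g[:, None], g[None, :])   # example graphon W(x,y) = min(x,y)
f = np.sin(2 * np.pi * g)                 # example input graphon signal

# Graphon filter: polynomial in T_W, with T_W discretized as (Wg @ f) / k
# (a Riemann-sum approximation of the integral operator).
h = [0.5, 1.0, -0.3]                      # illustrative filter taps h_0..h_2
z = sum(hk * np.linalg.matrix_power(Wg / k, i) @ f
        for i, hk in enumerate(h))

out = np.maximum(z, 0)                    # pointwise ReLU: one WNN layer
```

Stacking several such filterbank-plus-nonlinearity stages gives the WNN, and replacing `Wg / k` by a normalized graph shift operator recovers the corresponding GNN layer on a sampled graph.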
A Deep Learning Method for Optimal Investment Under Relative Performance Criteria Among Heterogeneous Agents [article] (arXiv pre-print, 2024)
Graphon games have been introduced to study games with many players who interact through a weighted graph of interaction. ...
By passing to the limit, a game with a continuum of players is obtained, in which the interactions are through a graphon. ...
Neural network architecture and training algorithm. In this subsection we discuss in detail the simulation algorithm and the neural network architecture. ...
arXiv:2402.07365v2
fatcat:j3sfyl223bbzzoxej5zq3iys4y
Size-Invariant Graph Representations for Graph Classification Extrapolations [article] (arXiv pre-print, 2021)
In general, graph representation learning methods assume that the train and test data come from the same distribution. ...
In this work we consider an underexplored area of an otherwise rapidly developing field of graph representation learning: The task of out-of-distribution (OOD) graph classification, where train and test ...
Andrews Fellowship, and the Wabash Heartland Innovation Network. ...
arXiv:2103.05045v2
fatcat:yul2sfauanhjfppdvdqz4vanme
Fixed-Point Centrality for Networks [article] (arXiv pre-print, 2022)
Such a centrality notion is immediately extended to define fixed-point centralities for infinite graphs characterized by graphons. ...
Fixed-point centralities connect with a variety of different models on networks including graph neural networks, static and dynamic games on networks, and Markov decision processes. ...
The graphon versions of graph neural networks as approximations or generalization models of graph neural networks are proposed and analyzed in [32] . ...
arXiv:2209.07070v1
fatcat:arfgoo7revckbeo66awqmndpvq
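Eigenvector centrality is a classical example of a centrality defined as a fixed point: the centrality vector c satisfies Ac = λc, i.e. c is a fixed point of the normalized map c ↦ Ac/‖Ac‖. A small power-iteration sketch (a generic illustration only; the paper's fixed-point centrality is a broader notion):

```python
import numpy as np

# Triangle 0-1-2 with a pendant node 3 attached to node 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Power iteration: repeatedly apply the normalized map c -> A c / ||A c||.
c = np.ones(4) / 4
for _ in range(200):
    c = A @ c
    c /= np.linalg.norm(c)

# At the fixed point, c is an eigenvector: A c = lambda * c.
assert np.allclose(A @ c, np.linalg.norm(A @ c) * c, atol=1e-6)
```

Node 2, which touches every other node, ends up with the largest centrality, matching the intuition that the fixed point rewards well-connected nodes.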
The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [article] (arXiv pre-print, 2020)
Graph convolutional networks (GCNs) are a widely used method for graph representation learning. ...
We exhibit an infinite class of graphons that are well-separated in terms of cut distance and are indistinguishable by a GCN with nonlinear activation functions coming from a certain broad class if its ...
This research was partially supported by grants from ARO W911NF-19-1026, ARO W911NF-15-1-0479, and ARO W911NF-14-1-0359 and the Blue Sky Initiative from the College of Engineering at the University of ...
arXiv:2002.05678v1
fatcat:lavqp6bznregxd5pzodgqewigi
Showing results 1–15 out of 163 results