Improving Fuzzy Multilevel Graph Embedding through Feature Selection Technique
[chapter]
2012
Lecture Notes in Computer Science
Feature selection permits FMGE to select the subset of the most discriminating features and to discard the confusing ones for the underlying graph dataset. ...
In this paper we take forward our work on explicit graph embedding and present an improvement to our earlier proposed method, named "fuzzy multilevel graph embedding (FMGE)", through a feature selection technique ...
Feature Selection by Ranking Discriminatory Features: The feature vector obtained by FMGE is based on histogram encoding of the multilevel information extracted from a graph. ...
doi:10.1007/978-3-642-34166-3_27
fatcat:zxl3fcf775c2hm23vurc3yfqd4
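The snippet names ranking of discriminatory features but not the scoring criterion. A minimal sketch, assuming a simple between/within-class variance ratio (not FMGE's exact rule), which ranks the dimensions of histogram-based embedding vectors and keeps the top-k:

```python
# Hypothetical sketch of feature selection by ranking: score each
# dimension of a histogram-based embedding and keep the top-k.
# The variance-ratio criterion below is an assumption, not FMGE's rule.

def rank_features(vectors, labels, top_k):
    """Return indices of the top_k most discriminating dimensions."""
    dims = len(vectors[0])
    classes = sorted(set(labels))
    scores = []
    for d in range(dims):
        col = [v[d] for v in vectors]
        mean_all = sum(col) / len(col)
        between = 0.0
        within = 0.0
        for c in classes:
            vals = [v[d] for v, y in zip(vectors, labels) if y == c]
            mc = sum(vals) / len(vals)
            between += len(vals) * (mc - mean_all) ** 2
            within += sum((x - mc) ** 2 for x in vals)
        # High between-class spread, low within-class spread scores well.
        scores.append(between / (within + 1e-9))
    return sorted(range(dims), key=lambda d: -scores[d])[:top_k]

# Toy example: dimension 0 separates the two classes, dimension 1 is noise.
X = [[0.0, 5.0], [0.1, 4.9], [1.0, 5.1], [1.1, 5.0]]
y = [0, 0, 1, 1]
print(rank_features(X, y, 1))  # → [0]
```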
Little Ball of Fur: A Python Library for Graph Sampling
[article]
2020
arXiv
pre-print
Experiments demonstrate that Little Ball of Fur can speed up node and whole-graph embedding techniques considerably while only mildly deteriorating the predictive value of the distilled features. ...
We show the practical usability of the library by estimating various global statistics of social networks and web graphs. ...
Using these embeddings as input features for node and graph classification tasks we establish that the embeddings learned on the subsampled graphs extract high quality features. Our contributions. ...
arXiv:2006.04311v2
fatcat:acolhpjfefbvpgwcuklm3c2zna
Little Ball of Fur
2020
Proceedings of the 29th ACM International Conference on Information & Knowledge Management
Experiments demonstrate that Little Ball of Fur can speed up node and whole-graph embedding techniques considerably while only mildly deteriorating the predictive value of the distilled features. ...
We show the practical usability of the library by estimating various global statistics of social networks and web graphs. ...
Using these embeddings as input features for node and graph classification tasks we establish that the embeddings learned on the subsampled graphs extract high quality features. Our contributions. ...
doi:10.1145/3340531.3412758
dblp:conf/cikm/RozemberczkiKS20a
fatcat:sqxrjqzyxzbufanmg5fyp6cxpu
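The entries above describe sampling a subgraph before embedding. As a sketch of the underlying idea (this is not Little Ball of Fur's own API; the adjacency-dict graph and the sampler are illustrative assumptions), a random-walk sampler that keeps the subgraph induced by the visited nodes:

```python
import random

# Minimal random-walk graph sampler, sketching the idea behind libraries
# such as Little Ball of Fur. NOT the library's API: the graph is a plain
# adjacency dict and the sampling loop is an illustrative assumption.

def random_walk_sample(adj, n_nodes, seed=0):
    """Walk until n_nodes distinct nodes are visited; return the induced subgraph."""
    rng = random.Random(seed)
    current = rng.choice(sorted(adj))
    visited = {current}
    while len(visited) < n_nodes:
        current = rng.choice(adj[current])
        visited.add(current)
    # Induced subgraph: keep only edges between sampled nodes.
    return {u: [v for v in adj[u] if v in visited] for u in visited}

graph = {
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3],
    3: [1, 2, 4], 4: [3, 5], 5: [4],
}
sub = random_walk_sample(graph, 4)
print(len(sub))  # → 4
```

An embedding method then runs on `sub` instead of `graph`, trading a small loss in feature quality for a large reduction in cost.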
Graph Representation Learning for Wireless Communications
[article]
2022
arXiv
pre-print
The potential of graph representation learning in wireless networks is presented via a few exemplary use cases and some initial results on GNN-based access point selection for cell-free massive MIMO systems ...
An overview of graph learning is provided which covers the fundamentals and concepts such as feature design over graphs, GNNs, and their design principles. ...
To obtain node features (statistics) of a graph, two general methods can be used: 1) importance-based features such as node degree or various node centrality measures; 2) structure-based features such ...
arXiv:2212.01904v1
fatcat:35rcd7gcibd6joewgje63nffvm
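The two feature families named in the snippet can be sketched over a plain adjacency-dict graph (the graph representation and the specific measures chosen here are assumptions for illustration):

```python
# Importance-based node features (degree, degree centrality) and a
# structure-based one (local clustering coefficient), computed on a
# plain adjacency-dict graph.

def degree(adj, u):
    return len(adj[u])

def degree_centrality(adj, u):
    # Degree normalised by the maximum possible degree (n - 1).
    return len(adj[u]) / (len(adj) - 1)

def clustering_coefficient(adj, u):
    # Fraction of pairs of u's neighbours that are themselves connected.
    nbrs = adj[u]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i, a in enumerate(nbrs) for b in nbrs[i + 1:] if b in adj[a])
    return 2.0 * links / (k * (k - 1))

graph = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1], 3: [0]}
print(degree(graph, 0))              # → 3
print(degree_centrality(graph, 0))   # → 1.0
print(clustering_coefficient(graph, 0))  # 1 of 3 neighbour pairs linked
```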
MotifExplainer: a Motif-based Graph Neural Network Explainer
[article]
2023
arXiv
pre-print
Our proposed motif-based methods can provide better human-understandable explanations than methods based on nodes, edges, and regular subgraphs. ...
We consider the explanation problem of Graph Neural Networks (GNNs). ...
The labels of graphs are assigned based on the associated motifs. All node features are initialized as vectors with all 1s. BA-Shapes [6] is a synthetic node classification dataset. ...
arXiv:2202.00519v2
fatcat:ijhwqh7mmzekpf4rnl44dh7egm
AutoGML: Fast Automatic Model Selection for Graph Machine Learning
[article]
2022
arXiv
pre-print
To capture the similarity across graphs from different domains, we introduce specialized meta-graph features that quantify the structural characteristics of a graph. ...
In this work, we develop the first meta-learning approach for automatic graph machine learning, called AutoGML, which utilizes the prior performances of existing methods on a wide variety of benchmark ...
Table 7 : Summary of the global statistical functions Σ for deriving a set of meta-graph features from a graph invariant (e.g., k-core numbers, node degrees, and so on). ...
arXiv:2206.09280v2
fatcat:xthb344hfvbe3mlhlasnf5fllu
A Comparison of Explicit and Implicit Graph Embedding Methods for Pattern Recognition
[chapter]
2013
Lecture Notes in Computer Science
Our preliminary experimentation on different chemoinformatics datasets illustrates that the two implicit and three explicit graph embedding approaches obtain competitive performance for the problem of ...
In recent years graph embedding has emerged as a promising solution for enabling the expressive, convenient, powerful but computationally expensive graph-based representations to benefit from mature, less ...
Method 3: Attribute Statistics based Embedding The attribute statistics based embedding of graphs is a simple and efficient way of expressing the labelling information stored in nodes and edges of graphs ...
doi:10.1007/978-3-642-38221-5_9
fatcat:pzp6r7hjgjbwrmbykpjapbz6t4
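Attribute-statistics embedding, as the snippet describes it, maps a graph to a fixed-length vector by counting label occurrences against a shared vocabulary. A minimal sketch for node labels (edge labels would be handled the same way; the function name and interface are assumptions):

```python
from collections import Counter

# Attribute-statistics embedding sketch: count node-label occurrences
# against a shared vocabulary to get a fixed-length vector per graph.

def attribute_statistics_embedding(node_labels, vocabulary):
    counts = Counter(node_labels)
    return [counts.get(label, 0) for label in vocabulary]

# Toy chemoinformatics-style example: atoms as node labels.
vocab = ["C", "N", "O"]
print(attribute_statistics_embedding(["C", "C", "O", "C"], vocab))  # → [3, 0, 1]
```

Because every graph is mapped against the same vocabulary, the resulting vectors are directly comparable and can feed any standard classifier.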
The Application of Graph Embedding Based on Random Walk
2022
Highlights in Science Engineering and Technology
Existing research on random-walk-based graph embedding methods is extensive. ...
The DeepWalk, Node2Vec, and HARP models are three graph embedding models based on the classical random walk model. Different data can be handled by generating different node sequences. ...
DeepWalk embeds both small and large graphs well, under the condition that the graphs are unweighted.
Table 1. Comparison based on the classic random walk model. ...
doi:10.54097/hset.v16i.2624
fatcat:6vilemy36vc75kmjavpscqow3a
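The node sequences the snippet mentions can be sketched as DeepWalk-style truncated random walks; the sequences are then fed to a SkipGram model, which is omitted here (the function and parameters are assumptions for illustration):

```python
import random

# DeepWalk-style walk corpus generation (sketch): start several truncated
# random walks from every node; the resulting node sequences serve as
# "sentences" for a SkipGram model (training itself is omitted).

def generate_walks(adj, num_walks, walk_length, seed=0):
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in sorted(adj):
            walk = [start]
            while len(walk) < walk_length:
                walk.append(rng.choice(adj[walk[-1]]))
            walks.append(walk)
    return walks

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
walks = generate_walks(graph, num_walks=2, walk_length=4)
print(len(walks))  # → 6 (2 walks per node * 3 nodes)
```

Node2Vec differs mainly in biasing the transition probabilities inside the walk; HARP coarsens the graph first and refines the embeddings level by level.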
A Knowledge Graph Entity Disambiguation Method Based on Entity-Relationship Embedding and Graph Structure Embedding
2021
Computational Intelligence and Neuroscience
To improve the Precision and Recall of entity disambiguation problems, we propose the EDEGE (Entity Disambiguation based on Entity and Graph Embedding) method, which utilizes the semantic embedding vector of the entity relationship and the embedding vector of the subgraph structure feature. ...
Entity Disambiguation Based on Graph Neural Network. The final entity disambiguation is based on the similarity measurement of the entity embedding vector V_KGE and the entity structure embedding vector ...
doi:10.1155/2021/2878189
pmid:34603428
pmcid:PMC8486511
fatcat:yyh34x6zfbhgdntxzurlpgwdki
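The snippet describes disambiguating by measuring the similarity of two embedding vectors. Cosine similarity is a common choice for this; the exact measure EDEGE uses is not shown here, so this is an assumption:

```python
import math

# Cosine similarity between two embedding vectors, a common similarity
# measure for comparing an entity embedding with a structure embedding.
# Whether EDEGE uses cosine specifically is an assumption.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

v_kge = [1.0, 0.0, 1.0]     # hypothetical entity embedding
v_struct = [1.0, 1.0, 0.0]  # hypothetical structure embedding
print(cosine_similarity(v_kge, v_struct))  # → 0.5
```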
Using ontology embeddings for structural inductive bias in gene expression data analysis
[article]
2020
arXiv
pre-print
Stratifying cancer patients based on their gene expression levels allows improving diagnosis, survival analysis and treatment planning. ...
We use ontology embeddings that capture the semantic similarities between the genes to direct a Graph Convolutional Network, and therefore sparsify the network connections. ...
Moreover, the ontology-based feature selection allows selecting a biologically relevant set of features. ...
arXiv:2011.10998v1
fatcat:px2whb62q5apjesojodgjq7hsu
Explainability-based Backdoor Attacks Against Graph Neural Networks
[article]
2021
arXiv
pre-print
For instance, on the node classification task, the backdoor attack with trigger injecting position selected by GraphLIME reaches over 84% attack success rate with less than 2.5% accuracy drop ...
There are already numerous works on backdoor attacks on neural networks, but only a few works consider graph neural networks (GNNs). ...
the node classification task based on different trigger-feature selection strategies. ...
arXiv:2104.03674v2
fatcat:iip3rirgtbbtpljsk4d24vdanu
Comparative effectiveness of medical concept embedding for feature engineering in phenotyping
2021
JAMIA Open
Of the MCEs based on knowledge graphs and EHR data, the MCEs learned by using node2vec with knowledge graphs and the MCEs learned by using GloVe with EHR data outperform the other MCEs, respectively. ...
Properly learned medical concept embeddings (MCEs) capture the semantics of medical concepts, thus are useful for retrieving relevant medical features in phenotyping tasks. ...
of the nodes in a graph using random walk. ...
doi:10.1093/jamiaopen/ooab028
pmid:34142015
pmcid:PMC8206403
fatcat:tmxwmx7cnfcgbgvzmldpdczkqm
Unsupervised and Scalable Algorithm for Learning Node Representations
2017
International Conference on Learning Representations
In this work, we propose a new unsupervised and efficient method, called here Neighborhood Based Node Embeddings (NBNE), capable of generating node embeddings for very large graphs. ...
This method is based on SkipGram and uses nodes' neighborhoods as contexts to generate representations. ...
To train NBNE on this task, we first obtained a sub-graph with 90% randomly selected edges from each dataset, and obtained the node embeddings by training NBNE on this sub-graph. ...
dblp:conf/iclr/PimentelVZ17
fatcat:4bjtsdnrenc2bcz3znbwhfs2da
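NBNE's key idea per the snippet is to use a node's neighborhood, rather than random-walk windows, as the SkipGram context. A sketch of the context construction (the pair format and function are assumptions; the actual NBNE implementation differs in detail):

```python
# NBNE-style context construction (sketch): each node's immediate
# neighbours serve as its SkipGram contexts. The (target, context)
# pair format is an illustrative assumption, not NBNE's exact code.

def neighborhood_contexts(adj):
    """Yield (target, context) training pairs for a SkipGram model."""
    return [(u, v) for u in sorted(adj) for v in adj[u]]

graph = {0: [1, 2], 1: [0], 2: [0]}
print(neighborhood_contexts(graph))  # → [(0, 1), (0, 2), (1, 0), (2, 0)]
```

These pairs would then be fed to a standard SkipGram objective, exactly as word/context pairs are in word2vec.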
Enhancing Real-World Complex Network Representations with Hyperedge Augmentation
[article]
2024
arXiv
pre-print
In this paper, we present Hyperedge Augmentation (HyperAug), a novel graph augmentation method that constructs virtual hyperedges directly from the raw data, and produces auxiliary node features by extracting ...
Existing graph augmentation methods mainly perturb the graph structures and are usually limited to pairwise node relations. ...
Node features are extracted based on the textual description of the products. ...
arXiv:2402.13033v1
fatcat:kmmjtz7njraxpj3zqc5lpngkxu
MultiBiSage: A Web-Scale Recommendation System Using Multiple Bipartite Graphs at Pinterest
[article]
2022
arXiv
pre-print
Graph Convolutional Networks (GCN) can efficiently integrate graph structure and node features to learn high-quality node embeddings. ...
MultiBiSage can capture the graph structure of multiple bipartite graphs to learn high-quality pin embeddings. ...
The first/second-order proximity-based methods define proximity based on the edge-weights or the number of times the nodes are part of a subgraph or meta-graphs [34] . ...
arXiv:2205.10666v1
fatcat:3pf3hwzyc5hubfqjqifmefh2zy
Showing results 1 — 15 out of 83,581 results