A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2018; you can also visit the original URL.
Exploit of online social networks with Semi-Supervised Learning
2010
The 2010 International Joint Conference on Neural Networks (IJCNN)
Recently, Semi-Supervised Learning (SSL), which has the advantage of utilizing fewer labeled data to achieve better performance compared to classical Supervised Learning, attracts much attention from the ...
Online social networks usually contain little publicly available information about users (labeled data) but a large amount of hidden information (unlabeled data). ...
Different from supervised learning only with labeled data and unsupervised learning only with unlabeled data, SSL learns knowledge with a small set of labeled data and a much larger set of unlabeled data ...
doi:10.1109/ijcnn.2010.5596580
dblp:conf/ijcnn/MoWLHK10
fatcat:qjxhptyuyvgrnoz23f2s7k6xgu
Semi-supervised convolutional neural networks for human activity recognition
2017
2017 IEEE International Conference on Big Data (Big Data)
Our semi-supervised CNNs learn from both labeled and unlabeled data while also performing feature learning on raw sensor data. ...
Semi-supervised learning augments labeled examples with unlabeled examples, often resulting in improved performance. ...
Semi-supervised learning from both labeled and unlabeled data can thus potentially provide better predictions for human walking in a hurry, compared to supervised learning using only labeled data. ...
doi:10.1109/bigdata.2017.8257967
dblp:conf/bigdataconf/ZengYWNML17
fatcat:xguekm7r5ndllaifl5njbt3qii
Semi-Supervised Convolutional Neural Networks for Human Activity Recognition
[article]
2018
arXiv
pre-print
Our semi-supervised CNNs learn from both labeled and unlabeled data while also performing feature learning on raw sensor data. ...
Semi-supervised learning augments labeled examples with unlabeled examples, often resulting in improved performance. ...
Semi-supervised learning from both labeled and unlabeled data can thus potentially provide better predictions for human walking in a hurry, compared to supervised learning using only labeled data. ...
arXiv:1801.07827v1
fatcat:5rk2julywzai5le62ha5n4yie4
Overcoming Relational Learning Biases to Accurately Predict Preferences in Large Scale Networks
2015
Proceedings of the 24th International Conference on World Wide Web - WWW '15
Further, semi-supervised learning methods could enable RML methods to exploit the large amount of unlabeled data in networks. ...
First, semi-supervised methods for RML do not fully utilize all the unlabeled instances in the network. ...
Acknowledgements We thank David Gleich for his help with optimizing the maximization problem for the learning algorithms and Iman Alodah for her help with parts of the data processing. ...
doi:10.1145/2736277.2741668
dblp:conf/www/PfeifferNB15
fatcat:ifuekasgbffk3i5s3ex5z63nm4
PTE: Predictive Text Embedding through Large-scale Heterogeneous Text Networks
2015
Proceedings of the 21st ACM SIGKDD International Conference on Knowledge Discovery and Data Mining - KDD '15
Predictive text embedding utilizes both labeled and unlabeled data to learn the embedding of text. ...
One possible reason is that these text embedding methods learn the representation of text in a fully unsupervised way, without leveraging the labeled information available for the task. ...
Acknowledgments Qiaozhu Mei is supported by the National Science Foundation under grant numbers IIS-1054199 and CCF-1048168. ...
doi:10.1145/2783258.2783307
dblp:conf/kdd/TangQM15
fatcat:lmjjjaflz5cxflatwplflftvtq
Learn to Propagate Reliably on Noisy Affinity Graphs
[article]
2020
arXiv
pre-print
Recent works have shown that exploiting unlabeled data through label propagation can substantially reduce the labeling cost, which has been a critical issue in developing visual recognition models. ...
outliers and moves forward the propagation frontier in a prudent way. ...
In early iterations, ∆c_τ can often be achieved by a small number of unlabeled vertices, as most vertices are unlabeled and have low confidences. ...
arXiv:2007.08802v1
fatcat:7gas2h2o6rc75inffyf4woblui
Exploit of Online Social Networks with Community-Based Graph Semi-Supervised Learning
[chapter]
2010
Lecture Notes in Computer Science
Recently, Semi-Supervised Learning (SSL), which has the advantage of utilizing the unlabeled data to achieve better performance, attracts much attention from the web research community. ...
With the rapid growth of the Internet, more and more people interact with their friends in online social networks like Facebook. ...
In the whole graph, there are l vertices labeled as Y_label and u vertices whose labels Ŷ_unlabel need to be predicted. ...
doi:10.1007/978-3-642-17537-4_81
fatcat:wtkdrfpyybbfbbl2v2yhxzke54
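The snippet above sets up graph-based SSL: l vertices carry labels Y_label, and the labels Ŷ_unlabel of the remaining u vertices must be predicted. A minimal iterative label-propagation sketch of that setup (the function name, affinity matrix, and toy chain graph are illustrative, not taken from the paper):

```python
import numpy as np

def label_propagation(W, Y_label, labeled_idx, n_classes, n_iter=50):
    """Minimal graph label-propagation sketch (names are illustrative).

    W           : (n, n) symmetric affinity matrix.
    Y_label     : classes of the l labeled vertices.
    labeled_idx : indices of those vertices.
    Returns hard predictions for all n vertices, i.e. Y_hat.
    """
    n = W.shape[0]
    F = np.zeros((n, n_classes))
    F[labeled_idx, Y_label] = 1.0                # clamp labeled vertices
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)  # row-stochastic
    for _ in range(n_iter):
        F = P @ F                                # diffuse labels along edges
        F[labeled_idx] = 0.0
        F[labeled_idx, Y_label] = 1.0            # re-clamp after each step
    return F.argmax(axis=1)

# Chain graph 0-1-2-3: vertex 0 labeled class 0, vertex 3 labeled class 1.
W = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    W[i, j] = W[j, i] = 1.0
Y_hat = label_propagation(W, np.array([0, 1]), np.array([0, 3]), n_classes=2)
# Y_hat -> [0, 0, 1, 1]: each unlabeled vertex takes the nearer seed's class
```

Clamping the labeled vertices after every diffusion step is what keeps the l known labels fixed while the u unknown ones converge.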
Every Node Counts: Self-Ensembling Graph Convolutional Networks for Semi-Supervised Learning
[article]
2018
arXiv
pre-print
model in semi-supervised learning. ...
In such a mutual-promoting process, both labeled and unlabeled samples can be fully utilized for backpropagating effective gradients to train GCN. In three article classification tasks, i.e. ...
In this paper, we propose a new architecture that can discover much more information within unlabeled vertices and learn from the global graph topology. ...
arXiv:1809.09925v1
fatcat:oyhjou5wsvf5jh7oprya2iaqca
Learning to Cluster Faces on an Affinity Graph
[article]
2019
arXiv
pre-print
Face recognition has seen remarkable progress in recent years, and its performance has reached a very high level. ...
Specifically, we propose a framework based on graph convolutional network, which combines a detection and a segmentation module to pinpoint face clusters. ...
A natural way is to treat all the vertices whose labels are different from the majority label as outliers. ...
arXiv:1904.02749v2
fatcat:sx5mffduszdg5ct6ifped6evae
Learning to Cluster Faces on an Affinity Graph
2019
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Face recognition has seen remarkable progress in recent years, and its performance has reached a very high level. ...
Specifically, we propose a framework based on graph convolutional network, which combines a detection and a segmentation module to pinpoint face clusters. ...
A natural way is to treat all the vertices whose labels are different from the majority label as outliers. ...
doi:10.1109/cvpr.2019.00240
dblp:conf/cvpr/YangZCYLL19
fatcat:fk4ykuipgjacblni6qhfumlgpq
Using Katz Centrality to Classify Multiple Pattern Transformations
2012
2012 Brazilian Symposium on Neural Networks
Usually, these methods consist of two stages: the construction of a network from the original vector-based data set and the learning in the constructed network. ...
Among the many machine learning methods developed for classification tasks, network-based learning algorithms have achieved great success. ...
ACKNOWLEDGMENT The authors would like to acknowledge the São Paulo State Research Foundation (FAPESP) and the Brazilian National Council for Scientific and Technological Development (CNPq) for the financial ...
doi:10.1109/sbrn.2012.23
dblp:conf/sbrn/CupertinoZ12a
fatcat:lj5cqdm5ybffri74jn2cttcdii
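The two-stage recipe quoted above — build a network from the vector-based data set, then learn on the constructed network — ends in a centrality computation. A minimal sketch of Katz centrality via the closed form x = (I − αA)⁻¹·1, valid when α < 1/λ_max(A) (the function name, α value, and star-graph example are illustrative; in this setting A would come from a graph built over the vector data):

```python
import numpy as np

def katz_centrality(A, alpha=0.1):
    """Katz centrality via the closed form x = (I - alpha*A)^(-1) @ 1.

    Valid when alpha < 1/lambda_max(A); A is the adjacency matrix of the
    network constructed from the vector-based data set.
    """
    n = A.shape[0]
    x = np.linalg.solve(np.eye(n) - alpha * A, np.ones(n))
    return x / np.linalg.norm(x)

# Star graph: vertex 0 is connected to vertices 1, 2, 3.
A = np.zeros((4, 4))
for leaf in (1, 2, 3):
    A[0, leaf] = A[leaf, 0] = 1.0
scores = katz_centrality(A)  # the hub should score highest
```

The hub accumulates contributions from every walk through it, so its score dominates the leaves'.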
Exemplar-Based Contrastive Self-Supervised Learning with Few-Shot Class Incremental Learning
[article]
2022
arXiv
pre-print
Humans are capable of learning new concepts from only a few (labeled) exemplars, incrementally and continually. ...
This suggests, in human learning, supervised learning of concepts based on exemplars takes place within the larger context of contrastive self-supervised learning (CSSL) based on unlabeled and labeled ...
This is done in NCL, which is discussed below. NCL proposes a holistic learning framework that uses contrastive loss formulation to learn discriminative features from both labeled and unlabeled data. ...
arXiv:2202.02601v1
fatcat:bwhivxbocraonlmxh25xp2uuea
Tree Energy Loss: Towards Sparsely Annotated Semantic Segmentation
[article]
2022
arXiv
pre-print
By sequentially applying these affinities to the network prediction, soft pseudo labels for unlabeled pixels are generated in a coarse-to-fine manner, achieving dynamic online self-training. ...
are labeled in each image. ...
., cross-entropy loss), any segmentation network can learn extra knowledge from unlabeled regions via dynamic online self-training. ...
arXiv:2203.10739v2
fatcat:mmh6z5d62fg75e6jzlifpeieom
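The entry above generates soft pseudo labels for unlabeled pixels so a segmentation network can self-train online. A generic confidence-thresholded variant of that idea (not the paper's tree-energy affinities; the function name and threshold are illustrative):

```python
import numpy as np

def pseudo_label_targets(probs, threshold=0.9):
    """Turn softmax outputs on unlabeled pixels into self-training targets.

    probs : (n_pixels, n_classes) predicted class probabilities.
    Returns hard pseudo labels plus a boolean mask keeping only the
    confident pixels; masked-out pixels contribute no loss.
    """
    confidence = probs.max(axis=1)
    targets = probs.argmax(axis=1)
    keep = confidence >= threshold
    return targets, keep

probs = np.array([[0.95, 0.05],   # confident -> used for self-training
                  [0.60, 0.40],   # uncertain -> ignored
                  [0.08, 0.92]])  # confident -> used
targets, keep = pseudo_label_targets(probs)
```

During training, the mask would zero out the cross-entropy loss on uncertain pixels so only reliable pseudo labels propagate gradients.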
A Novel Semi-Supervised Learning Method Based on Fast Search and Density Peaks
2019
Complexity
However, unlike the existing semi-supervised learning methods, we do not use unlabeled samples directly and, instead, look for safe and reliable unlabeled samples before using them. ...
In this paper, two new semi-supervised learning methods are proposed: a semi-supervised learning method based on fast search and density peaks (S2DP) and an iterative S2DP method (IS2DP). ...
performs stably, probably resulting from unsupervised learning (Autoencoder) embedded in the Ladder Network algorithm which could learn and recognize unlabeled samples and reduce certain interference. ...
doi:10.1155/2019/6876173
fatcat:band2f5vrbglpobo4tpe57lsie
Multi-Stage Self-Supervised Learning for Graph Convolutional Networks on Graphs with Few Labels
[article]
2020
arXiv
pre-print
Graph Convolutional Networks (GCNs) play a crucial role in graph learning tasks; however, learning graph embeddings with few supervised signals is still a difficult problem. ...
In this paper, we propose a novel training algorithm for Graph Convolutional Networks, called the Multi-Stage Self-Supervised (M3S) Training Algorithm, combined with a self-supervised learning approach, focusing ...
... a fixed number of epochs on the initial labeled and unlabeled sets L_0, U_0
2: for each stage k do
3:   Sort vertices by confidence in the unlabeled set U_{k−1}.
4:   for each class j do
5:     Find the top t vertices ...
arXiv:1902.11038v2
fatcat:3chqp7uzlfhujggjtibyuoqfnm
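The algorithm excerpt above sorts unlabeled vertices by confidence and, per class, promotes the top t into the labeled set for the next stage. One such stage might look like the following sketch (the helper name and toy confidences are illustrative, not from the paper):

```python
import numpy as np

def expand_labeled_set(probs, labeled_idx, unlabeled_idx, t=1):
    """One self-training stage in the spirit of the excerpt (hypothetical helper).

    probs : (n, n_classes) model confidences over all vertices.
    For each class, pick the top-t most confident unlabeled vertices and
    move them, with their predicted class as pseudo label, into the
    labeled set.
    """
    unlabeled_idx = np.asarray(unlabeled_idx)
    probs_u = probs[unlabeled_idx]
    preds = probs_u.argmax(axis=1)          # predicted class per unlabeled vertex
    conf = probs_u.max(axis=1)              # confidence used for sorting
    picked, pseudo = [], []
    for j in range(probs.shape[1]):         # for each class j ...
        members = np.where(preds == j)[0]
        top = members[np.argsort(conf[members])[::-1][:t]]
        picked.extend(unlabeled_idx[top].tolist())
        pseudo.extend([j] * len(top))
    new_labeled = np.concatenate([np.asarray(labeled_idx), picked]).astype(int)
    remaining = np.setdiff1d(unlabeled_idx, picked)
    return new_labeled, np.array(pseudo, dtype=int), remaining

# Toy run: vertices 0 and 3 are labeled; 1, 2, 4, 5 are unlabeled.
probs = np.array([[1.0, 0.0],
                  [0.9, 0.1],
                  [0.8, 0.2],
                  [0.0, 1.0],
                  [0.2, 0.8],
                  [0.1, 0.9]])
new_labeled, pseudo, remaining = expand_labeled_set(probs, [0, 3], [1, 2, 4, 5], t=1)
```

Each stage the model would be retrained on the enlarged labeled set before the next selection, which is what makes the procedure multi-stage.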
Showing results 1 — 15 out of 16,265 results