129,228 Hits in 5.6 sec

Literature Review on Feature Selection Methods for High-Dimensional Data

D. Asir, S. Appavu, E. Jebamalar
2016 International Journal of Computer Applications  
Feature selection plays a significant role in improving the performance of machine learning algorithms, both by reducing the time needed to build the learning model and by increasing accuracy in the learning  ...  Keywords: introduction to variable and feature selection, information gain-based feature selection, gain ratio-based feature selection, symmetric uncertainty-based feature selection, subset-based feature  ...  Feature Selection Based on the Supervised Learning Algorithm Used: this section reviews various methods of feature selection based on the machine learning algorithm used.  ... 
doi:10.5120/ijca2016908317 fatcat:fi3dkzxwnjgp5mop6xdr5luaze
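The information gain-based scoring this review lists first has a compact empirical form for discrete features. The sketch below is our own minimal illustration (function names are hypothetical, not the paper's):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a discrete label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def information_gain(feature, labels):
    """IG(Y; X) = H(Y) - H(Y | X) for one discrete feature column."""
    h_cond = sum((feature == v).mean() * entropy(labels[feature == v])
                 for v in np.unique(feature))
    return entropy(labels) - h_cond

# Toy data: column 0 determines the label, column 1 is uninformative.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 1, 1])
scores = [information_gain(X[:, j], y) for j in range(X.shape[1])]
# scores[0] == 1.0 bit, scores[1] == 0.0 bits
```

The gain ratio variant mentioned alongside it divides this score by H(X) to penalize many-valued features.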

TVDIM: Enhancing Image Self-Supervised Pretraining via Noisy Text Data [article]

Pengda Qin, Yuhong Li, Kefeng Deng, Qiang Wu
2021 arXiv   pre-print
Considering the information gap between inter-modality feature pairs caused by data noise, we adopt ranking-based contrastive learning to optimize the mutual information.  ...  Our core idea of self-supervised learning is to maximize the mutual information between features extracted from multiple views of a shared context to a rational degree.  ...  learning based on a transformer network.  ... 
arXiv:2106.01797v2 fatcat:ut3dcxos7bhelb6vllwo5dicti

Machine Learning Based Computational Gene Selection Models: A Survey, Performance Evaluation, Open Issues, and Future Research Directions

Nivedhitha Mahendran, P. M. Durai Raj Vincent, Kathiravan Srinivasan, Chuan-Yu Chang
2020 Frontiers in Genetics  
The study categorizes various feature selection algorithms under supervised, unsupervised, and semi-supervised learning.  ...  This paper provides an extensive review of the various works on machine learning-based gene selection in recent years, along with a performance analysis.  ...  Gene selection based on machine learning can be classified into three types: supervised, unsupervised, and semi-supervised.  ... 
doi:10.3389/fgene.2020.603808 pmid:33362861 pmcid:PMC7758324 fatcat:jhyfsc72tngwhnrl4vxg3k4tii

Supervised Infinite Feature Selection [article]

Sadegh Eskandari, Emre Akbas
2017 arXiv   pre-print
In this paper, we present a new feature selection method that is suitable for both unsupervised and supervised problems.  ...  We build upon the recently proposed Infinite Feature Selection (IFS) method where feature subsets of all sizes (including infinity) are considered. We extend IFS in two ways.  ...  For supervised feature selection, standardizing the data and then using mutual information based relevance in combination with SPR based redundancy gives the best classification performance.  ... 
arXiv:1704.02665v3 fatcat:jacj7tawbnh6da4qyf5wnxmwfy
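The relevance-plus-redundancy combination this snippet alludes to can be sketched as a greedy selection loop. This is an mRMR-style illustration under our own simplifications: discrete features, and average pairwise mutual information standing in for the paper's SPR-based redundancy term (names are ours):

```python
import numpy as np

def mutual_info_discrete(x, y):
    """Empirical mutual information (bits) between two discrete arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                mi += pxy * np.log2(pxy / (np.mean(x == xv) * np.mean(y == yv)))
    return mi

def greedy_mi_selection(X, y, k):
    """Greedily pick k features maximizing relevance minus redundancy."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        best, best_score = None, -np.inf
        for j in remaining:
            rel = mutual_info_discrete(X[:, j], y)
            red = (np.mean([mutual_info_discrete(X[:, j], X[:, s])
                            for s in selected]) if selected else 0.0)
            if rel - red > best_score:
                best, best_score = j, rel - red
        selected.append(best)
        remaining.remove(best)
    return selected
```

The double sum over value pairs is quadratic in the alphabet size, so real implementations bin or standardize continuous features first, as the snippet suggests.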

Graph Cross Networks with Vertex Infomax Pooling [article]

Maosen Li, Siheng Chen, Ya Zhang, Ivor W. Tsang
2020 arXiv   pre-print
The proposed VIPool selects the most informative subset of vertices based on the neural estimation of mutual information between vertex features and neighborhood features.  ...  Based on trainable hierarchical representations of a graph, GXN enables the interchange of intermediate features across scales to promote information flow.  ...  Compared to these mutual-information-based studies, the proposed VIPool, which also leverages mutual information maximization on graphs, aims to obtain an optimization for vertex selection by finding the  ... 
arXiv:2010.01804v2 fatcat:c3ft5pozizbihaot76dkcfwl2y

Clustering-Based Feature Selection in Semi-supervised Problems

Ianisse Quinzán, José M. Sotoca, Filiberto Pla
2009 2009 Ninth International Conference on Intelligent Systems Design and Applications  
This method selects variables using a feature clustering strategy with a combination of supervised and unsupervised feature distance measures, based on Conditional Mutual Information and Conditional  ...  In this contribution, a feature selection method for semi-supervised problems is proposed.  ...  CONCLUSIONS: In this paper, a filter feature selection technique based on information theory for semi-supervised problems has been proposed.  ... 
doi:10.1109/isda.2009.211 dblp:conf/isda/QuinzanSP09 fatcat:qm5llihvrnhihkxaslsjq6aeuq
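Conditional mutual information, the quantity this entry's distance measure is built on, has a direct empirical estimator for discrete variables. A minimal sketch (helper names are ours, not from the paper):

```python
import numpy as np

def _mi_bits(x, y):
    """Empirical mutual information I(X; Y) in bits for discrete arrays."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                mi += pxy * np.log2(pxy / (np.mean(x == xv) * np.mean(y == yv)))
    return mi

def cond_mi_bits(x, y, z):
    """I(X; Y | Z) = sum over z of p(z) * I(X; Y | Z = z)."""
    return sum((z == zv).mean() * _mi_bits(x[z == zv], y[z == zv])
               for zv in np.unique(z))

x = np.array([0, 1, 0, 1]); z = np.array([0, 0, 1, 1])
# cond_mi_bits(x, x, z) == 1.0 : X still carries 1 bit about itself given Z
# cond_mi_bits(z, z, z) == 0.0 : Z carries nothing new once Z is known
```

Conditioning is what lets a semi-supervised criterion discount information already explained by the labeled part of the data.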

SUGAR: Subgraph Neural Network with Reinforcement Pooling and Self-Supervised Mutual Information Mechanism [article]

Qingyun Sun, Jianxin Li, Hao Peng, Jia Wu, Yuanxing Ning, Phillip S. Yu, Lifang He
2021 arXiv   pre-print
by maximizing their mutual information.  ...  This paper presents a novel hierarchical subgraph-level selection and embedding based graph neural network for graph classification, namely SUGAR, to learn more discriminative subgraph representations  ...  sketched graph and learn subgraph embeddings by an attention mechanism and a self-supervised mutual information mechanism.  ... 
arXiv:2101.08170v3 fatcat:62nxzvfqfnb75ibuoo2b432x64

Hypergraph Spectra for Semi-supervised Feature Selection [chapter]

Zhihong Zhang, Edwin R. Hancock, Xiao Bai
2012 Lecture Notes in Computer Science  
Most existing feature selection methods focus on ranking individual features based on a utility criterion, and select the optimal feature set in a greedy manner.  ...  In this paper, we propose a novel hypergraph based semi-supervised feature selection algorithm to select relevant features using both labeled and unlabeled data.  ...  An important line of research in this area is the use of methods based on mutual information.  ... 
doi:10.1007/978-3-642-33460-3_19 fatcat:h6dspslxybbodbhumd4bmjnw4e

Can feature information interaction help for information fusion in multimedia problems?

Jana Kludas, Eric Bruno, Stéphane Marchand-Maillet
2008 Multimedia tools and applications  
The article presents information-theoretic feature information interaction, a measure that can describe complex feature dependencies in multivariate settings.  ...  In experiments with artificial and real data, we compare empirical estimates of correlation, mutual information, and 3-way feature interaction.  ...  It can be exploited, e.g., for feature selection in supervised learning or for feature construction in the unsupervised case.  ... 
doi:10.1007/s11042-008-0251-y fatcat:jeytzmxp6fdivn7puso75ukraq

Self-supervised Adversarial Training [article]

Kejiang Chen, Hang Zhou, Yuefeng Chen, Xiaofeng Mao, Yuhong Li, Yuan He, Hui Xue, Weiming Zhang, Nenghai Yu
2020 arXiv   pre-print
To escape from this predicament, many works try to harden the model in various ways, among which adversarial training is an effective approach that learns robust feature representations so as to resist adversarial  ...  To further strengthen the defense, self-supervised adversarial training is proposed, which maximizes the mutual information between the representations of original examples and the corresponding  ...  Deep InfoMax (DIM) [23] maximizes mutual information between global features and local features.  ... 
arXiv:1911.06470v2 fatcat:4hvazvht7rdnldvjjaoomqzwcy
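Several results above (DIM, this entry) maximize mutual information through a contrastive bound rather than estimating it directly. The standard InfoNCE estimator such methods build on can be sketched as follows (our own minimal version, operating on a precomputed similarity matrix):

```python
import numpy as np

def infonce_bound(scores):
    """InfoNCE lower bound on mutual information (nats).
    scores[i, j]: similarity of view-1 sample i with view-2 sample j;
    the diagonal holds the positive pairs."""
    n = scores.shape[0]
    # log-softmax of the positive entry within each row, averaged
    logsumexp = np.log(np.exp(scores).sum(axis=1))
    return float(np.mean(np.diag(scores) - logsumexp) + np.log(n))

# Uninformative scores give a bound of 0; confident matching approaches log(n).
```

The bound saturates at log(n), which is why contrastive mutual-information estimates rely on large batch sizes.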

Deep Comprehensive Correlation Mining for Image Clustering [article]

Jianlong Wu, Keyu Long, Fei Wang, Chen Qian, Cheng Li, Zhouchen Lin, Hongbin Zha
2019 arXiv   pre-print
data from three aspects: 1) instead of only using pair-wise information, pseudo-label supervision is proposed to investigate category information and learn discriminative features; 2) the features' robustness  ...  for the clustering problem, to lift the recently discovered instance-level deep mutual information to a triplet-level formulation, which further helps to learn more discriminative features.  ...  Analogous to supervised learning, this approach lifts instance-level mutual information supervision to triplet-level supervision.  ... 
arXiv:1904.06925v3 fatcat:ljw43lvlcbcxxetf7wfgpupqfe

A Brief Summary of Interactions Between Meta-Learning and Self-Supervised Learning [article]

Huimin Peng
2021 arXiv   pre-print
In self-supervised learning, data augmentation techniques are widely applied and data labels are not required since pseudo labels can be estimated from trained models on similar tasks.  ...  Self-supervised learning utilizes self-supervision from original data and extracts higher-level generalizable features through unsupervised pre-training or optimization of contrastive loss objectives.  ...  I am grateful for valuable publications from Jacques Pitrat, which describe his research work on artificial general intelligence in detail.  ... 
arXiv:2103.00845v2 fatcat:soq6tfl56vgshebtnot57e4qwe

Machine Learning Based Supervised Feature Selection Algorithm for Data Mining

2019 International Journal of Innovative Technology and Exploring Engineering (IJITEE), Volume 8, Issue 10, August 2019
In this paper, feature selection for supervised algorithms in data mining is considered, and an overview of existing machine learning algorithms for supervised feature selection is given.  ...  Data scientists focus on high-dimensional data to predict and reveal interesting patterns as well as the most useful information for the modern world.  ...  RFE is a supervised learning technique; it operates as follows [6][7]: it first trains the classifier with the original feature  ... 
doi:10.35940/ijitee.j9483.0881019 fatcat:d2sn5qqyrfaj3fpddarejf7qpi
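The RFE procedure this snippet begins to describe (train, rank the weights, drop the weakest feature, repeat) can be sketched with a least-squares model standing in for the classifier. This is an illustration under our own assumptions, not the paper's code:

```python
import numpy as np

def rfe(X, y, n_keep):
    """Recursive feature elimination sketch: repeatedly fit a linear
    model and drop the feature with the smallest absolute weight."""
    active = list(range(X.shape[1]))
    while len(active) > n_keep:
        w, *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
        active.pop(int(np.argmin(np.abs(w))))
    return active

# Column 0 predicts y exactly; column 1 is noise orthogonal to y.
X = np.array([[1.0, 0.1], [1.0, -0.1], [0.0, 0.1], [0.0, -0.1]])
y = np.array([1.0, 1.0, 0.0, 0.0])
# rfe(X, y, 1) == [0]
```

Retraining after every elimination is what distinguishes RFE from one-shot weight ranking: a feature's weight can change once its correlated partners are removed.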

Semisupervised Association Learning Based on Partial Differential Equations for Sparse Representation of Image Class Attributes

Wei Song, Guang Hu, Liuqing OuYang, Zhenjie Zhu, Miaochao Chen
2021 Advances in Mathematical Physics  
In this paper, we propose a multitask, multiview semi-supervised learning model based on a partial differential equation random field and a Hilbert independence criterion probability model of image class attributes  ...  labels in the label space and the correlations between hidden features and labels.  ...  calculates their mutual information with the label space; based on the size of the mutual information, the sequential arrangement of features yields the relevant feature subset; and then further considers  ... 
doi:10.1155/2021/4784411 fatcat:qhxwsag4rzfuvcazfzscrl2dbe

Constrained Multiview Representation for Self-supervised Contrastive Learning [article]

Siyuan Dai, Kai Ye, Kun Zhao, Ge Cui, Haoteng Tang, Liang Zhan
2024 arXiv   pre-print
In this work, we introduce a novel approach predicated on representation distance-based mutual information (MI) maximization for measuring the significance of different views, aiming at conducting more  ...  Specifically, we harness multi-view representations extracted from the frequency domain, re-evaluating their significance based on mutual information across varying frequencies, thereby facilitating a  ...  CONCLUSION In this study, we introduce a novel self-supervised contrastive learning approach, augmented by a mutual information-based feature selection mechanism, for enhanced representation learning in  ... 
arXiv:2402.03456v1 fatcat:rvxpdxygsvbyvgsatujebknc6q
Showing results 1 — 15 out of 129,228 results