6,475 Hits in 6.6 sec

A Multiple Maximum Scatter Difference Discriminant Criterion for Facial Feature Extraction

Fengxi Song, D. Zhang, Dayong Mei, Zhongwei Guo
2007 IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)  
Unlike most other subspace-based feature-extraction methods, the MMSD computes its discriminant vectors from both the range of the between-class scatter matrix and the null space of the within-class scatter  ...  The MMSD feature-extraction method, which is based on this novel discriminant criterion, is a new subspace-based feature-extraction method.  ...  In order to eliminate the influences of the lengths of the discriminant vectors and linear dependences between these vectors, we usually require them to be orthonormal.  ... 
doi:10.1109/tsmcb.2007.906579 pmid:18179076 fatcat:77vteyxhzvcwfnqkdpn75whw4q
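
The snippet above describes drawing discriminant vectors from the null space of the within-class scatter matrix and requiring them to be orthonormal. The numpy sketch below illustrates that general idea (null-space directions ranked by between-class scatter, then orthonormalized); it is not the paper's exact MMSD criterion, and the function name and numerical threshold are assumptions.

```python
import numpy as np

def null_space_discriminant_vectors(X, y, n_vectors=2):
    """Illustrative only: discriminant directions taken from the null space
    of the within-class scatter Sw and ranked by between-class scatter Sb."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
    # Null space of Sw: eigenvectors whose eigenvalues are numerically zero.
    evals, evecs = np.linalg.eigh(Sw)
    null_basis = evecs[:, evals < 1e-10 * max(evals.max(), 1.0)]
    # Rank the null-space directions by how much between-class scatter they keep.
    vals, vecs = np.linalg.eigh(null_basis.T @ Sb @ null_basis)
    W = null_basis @ vecs[:, np.argsort(vals)[::-1][:n_vectors]]
    # Orthonormalize, as the abstract suggests (QR yields an orthonormal set).
    Q, _ = np.linalg.qr(W)
    return Q
```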

A Rank-One Update Algorithm for Fast Solving Kernel Foley–Sammon Optimal Discriminant Vectors

Wenming Zheng, Zhouchen Lin, Xiaoou Tang
2010 IEEE Transactions on Neural Networks  
The FSODVs method outperforms the classic Fisher linear discriminant analysis (FLDA) method in the sense that it can solve more discriminant vectors for recognition.  ...  A popular method is the Foley-Sammon optimal discriminant vectors (FSODVs) method, which aims to find an optimal set of discriminant vectors that maximize the Fisher discriminant criterion under the orthogonal  ...  ACKNOWLEDGMENT The authors would like to thank one of the anonymous reviewers for pointing out the possibility of using the ROU technique for matrix inverses.  ... 
doi:10.1109/tnn.2009.2037149 pmid:20089474 fatcat:7bimsxsapjhktgnxtvbteskjae

Why can LDA be performed in PCA transformed space?

Jian Yang, Jing-yu Yang
2003 Pattern Recognition  
PCA plus LDA is a popular framework for linear discriminant analysis (LDA) in the high-dimensional, singular case. In this paper, we focus on building a theoretical foundation for this framework.  ...  Moreover, we point out the weakness of the previous LDA-based methods, and suggest a complete PCA plus LDA algorithm.  ...  In this paper, we intend to solve these problems and build a theoretical foundation for the PCA plus LDA method.  ... 
doi:10.1016/s0031-3203(02)00048-1 fatcat:ahzm3urswfcxhjwfzftzzqiiaq
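
The framework under study is simply PCA followed by LDA in the reduced space. A minimal pipeline sketch, assuming scikit-learn and an arbitrary choice of 40 principal components; the paper's "complete PCA plus LDA" additionally exploits the null space of the within-class scatter, which this sketch ignores.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
# PCA first removes the singularity of the within-class scatter matrix,
# then LDA is carried out in the PCA-transformed space.
clf = make_pipeline(PCA(n_components=40), LinearDiscriminantAnalysis())
clf.fit(X, y)
print(clf.score(X, y))
```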

Learning Discriminative Canonical Correlations for Object Recognition with Image Sets [chapter]

Tae-Kyun Kim, Josef Kittler, Roberto Cipolla
2006 Lecture Notes in Computer Science  
Specifically, inspired by classical Linear Discriminant Analysis (LDA), we develop a linear discriminant function that maximizes the canonical correlations of within-class sets and minimizes the canonical  ...  As a way of comparing sets of vectors or images, canonical correlations offer many benefits in accuracy, efficiency, and robustness compared to the classical parametric distribution-based and non-parametric  ...  Acknowledgements The authors acknowledge the support of the Toshiba Corporation and the Chevening Scholarship.  ... 
doi:10.1007/11744078_20 fatcat:r5dvs5bsyrdwbmw2p4te447dlq
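
Canonical correlations between two image sets can be read off from the singular values of the product of orthonormal bases of their linear spans; the numpy sketch below shows that similarity measure only, not the discriminative transformation the paper learns on top of it. The function name and the choice of 5 basis vectors are assumptions.

```python
import numpy as np

def canonical_correlations(A, B, dim=5):
    """Canonical correlations between the subspaces spanned by two image sets
    A and B (each of shape features x samples): the singular values of
    U1^T U2, where U1 and U2 are orthonormal bases of the two spans."""
    U1, _, _ = np.linalg.svd(A, full_matrices=False)
    U2, _, _ = np.linalg.svd(B, full_matrices=False)
    sigma = np.linalg.svd(U1[:, :dim].T @ U2[:, :dim], compute_uv=False)
    return np.clip(sigma, 0.0, 1.0)

# A simple set-to-set similarity is then, e.g., the mean of these correlations.
```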

Maximum Entropy Linear Manifold for Learning Discriminative Low-Dimensional Representation [chapter]

Wojciech Marian Czarnecki, Rafal Jozefowicz, Jacek Tabor
2015 Lecture Notes in Computer Science  
MELM provides highly discriminative 2D projections of the data which can be used as a method for constructing robust classifiers.  ...  Representation learning is currently a very hot topic in modern machine learning, mostly due to the great success of the deep learning methods.  ...  The work has been partially financed by National Science Centre Poland grant no. 2014/13/B/ST6/01792.  ... 
doi:10.1007/978-3-319-23528-8_4 fatcat:owyzr6uxijahvfe2ogwhxox6dm

Efficient Pseudoinverse Linear Discriminant Analysis and Its Nonlinear Form for Face Recognition

Jun Liu, Songcan Chen, Xiaoyang Tan, Daoqiang Zhang
2007 International Journal of Pattern Recognition and Artificial Intelligence  
than the recently proposed Linear Discriminant Analysis via QR decomposition and Discriminant Common Vectors.  ...  Pseudoinverse Linear Discriminant Analysis (PLDA) is a classical and pioneer method that deals with the Small Sample Size (SSS) problem in LDA when applied to such application as face recognition.  ...  RSw+LDA relaxes the demanding computational and storage requirement in PLDA, and thus makes possible the comparison between PLDA and other methods.  ... 
doi:10.1142/s0218001407005946 fatcat:plw7tyi3rjghtlcnbxsltd4jda
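
A bare-bones numpy sketch of the pseudoinverse idea the abstract refers to: when the within-class scatter Sw is singular (the small sample size case), its inverse in the Fisher criterion is replaced by the Moore-Penrose pseudoinverse. The efficient formulation and the RSw+LDA comparison from the paper are not reproduced here.

```python
import numpy as np

def pseudoinverse_lda(X, y, n_components=2):
    """Sketch: projection directions from the leading eigenvectors of
    pinv(Sw) @ Sb, usable even when Sw is singular."""
    d = X.shape[1]
    mean_all = X.mean(axis=0)
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean_all, mc - mean_all)
    evals, evecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)  # not symmetric in general
    order = np.argsort(evals.real)[::-1][:n_components]
    return evecs[:, order].real
```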

Page 5122 of Mathematical Reviews Vol. , Issue 911 [page]

1991 Mathematical Reviews  
However, it has not been discussed theoretically in comparison with a well-known discriminant analysis.  ...  Okada and Tomita presented an orthonormal discriminant vector method based on the Fisher criterion. The advantage of their method has been discussed experimentally.  ... 

Discriminant analysis based on projection onto generalized difference subspace [article]

Kazuhiro Fukui, Naoya Sogi, Takumi Kobayashi, Jing-Hao Xue, Atsuto Maki
2019 arXiv   pre-print
Furthermore, to enhance the performances of gFDA and GDS projection, we normalize the projected vectors on the discriminant spaces.  ...  This paper discusses a new type of discriminant analysis based on the orthogonal projection of data onto a generalized difference subspace (GDS).  ...  We would like to thank Hideitsu Hino and Rui Zhu for their helpful comments.  ... 
arXiv:1910.13113v2 fatcat:dyyy4vaxqnadtbzfny2t6c42fy

Geometric Distribution Weight Information Modeled Using Radial Basis Function with Fractional Order for Linear Discriminant Analysis Method

Wen-Sheng Chen, Chu Zhang, Shengyong Chen
2013 Advances in Mathematical Physics  
Fisher linear discriminant analysis (FLDA) is a classic linear feature extraction and dimensionality reduction approach for face recognition.  ...  samples and proposes a novel geometric distribution weight information based Fisher discriminant criterion.  ...  The authors would like to thank the Olivetti Research Laboratory and the Amy Research Laboratory for providing the face image databases.  ... 
doi:10.1155/2013/825861 fatcat:wjuxsj2k5rfotadrjms7iwi74e

Decision boundary feature extraction for nonparametric classification

C. Lee, D.A. Landgrebe
1993 IEEE Transactions on Systems, Man and Cybernetics  
Since non-parametric classifiers do not define decision boundaries in analytic form, the decision boundary and normal vectors must be estimated numerically.  ...  We propose a procedure to extract discriminantly informative features based on a decision boundary for non-parametric classification.  ...  One can always find an orthonormal basis for a vector space using the Gram-Schmidt procedure.  ... 
doi:10.1109/21.229456 fatcat:qdojwhhvrjcxtjewayf4tvhbl4
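
The last snippet mentions Gram-Schmidt as the way to obtain an orthonormal basis, e.g. for the span of numerically estimated decision-boundary normal vectors. A short numpy sketch of the classical procedure:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Classical Gram-Schmidt: turn a list of (possibly linearly dependent)
    vectors into an orthonormal basis for their span."""
    basis = []
    for v in vectors:
        w = v - sum((v @ b) * b for b in basis)   # remove components along the basis
        norm = np.linalg.norm(w)
        if norm > tol:                            # drop vectors already in the span
            basis.append(w / norm)
    return np.array(basis)
```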

Spectral Normalization for Generative Adversarial Networks [article]

Takeru Miyato, Toshiki Kataoka, Masanori Koyama, Yuichi Yoshida
2018 arXiv   pre-print
In this paper, we propose a novel weight normalization technique called spectral normalization to stabilize the training of the discriminator.  ...  We tested the efficacy of spectral normalization on CIFAR10, STL-10, and ILSVRC2012 dataset, and we experimentally confirmed that spectrally normalized GANs (SN-GANs) is capable of generating images of  ...  We also would like to thank anonymous reviewers and commenters on the OpenReview forum for insightful discussions.  ... 
arXiv:1802.05957v1 fatcat:xxa425wf4faxrga463thngjk6a
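
Spectral normalization divides each weight matrix by an estimate of its largest singular value, typically obtained with a few power-iteration steps. A minimal numpy sketch of that estimate; the paper reuses the vectors u and v across training steps and applies the normalization per layer inside the discriminator, which is not shown here.

```python
import numpy as np

def spectral_norm(W, n_iters=20, eps=1e-12):
    """Power-iteration estimate of the largest singular value of W."""
    u = np.random.randn(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v) + eps
        u = W @ v
        u /= np.linalg.norm(u) + eps
    return float(u @ W @ v)

W = np.random.randn(64, 128)
W_sn = W / spectral_norm(W)   # spectrally normalized weight: largest singular value ~ 1
```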

Fast linear discriminant analysis using binary bases

Feng Tang, Hai Tao
2007 Pattern Recognition Letters  
The main computation in LDA is the dot product between an LDA basis vector and the data point, which involves costly element-wise floating-point multiplications.  ...  Linear discriminant analysis (LDA) is a widely used discriminative method.  ... 
doi:10.1016/j.patrec.2007.07.007 fatcat:gpu627vxiffklifdkqfjwrngby
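
The speed-up rests on expressing a real-valued LDA basis vector as a small weighted sum of binary vectors, so the projection dot product reduces to additions plus a few scalar multiplications. The greedy expansion below only illustrates that idea, not the paper's optimization; the function name and the number of bases are assumptions.

```python
import numpy as np

def binary_expansion(w, n_bases=8):
    """Greedily approximate w by sum_k c_k * b_k with b_k in {-1, +1}^d,
    so that w . x ~= sum_k c_k * (b_k . x)."""
    residual = np.asarray(w, dtype=float).copy()
    bases, coeffs = [], []
    for _ in range(n_bases):
        b = np.where(residual >= 0, 1.0, -1.0)
        c = (residual @ b) / (b @ b)   # least-squares coefficient for this base
        bases.append(b)
        coeffs.append(c)
        residual -= c * b
    return np.array(bases), np.array(coeffs)

w, x = np.random.randn(256), np.random.randn(256)
B, c = binary_expansion(w)
print(w @ x, c @ (B @ x))   # exact projection vs. binary-basis approximation
```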

From Projection Pursuit and CART to Adaptive Discriminant Analysis?

R. Gribonval
2005 IEEE Transactions on Neural Networks  
In this paper, we try to advocate the idea that such developments and efforts are worthwhile, based on the theoretical study of a data-driven discriminant analysis method on a simple, yet instructive, example  ...  Unlike the linear discriminant analysis (LDA) strategy, which selects subspaces that do not depend on the observed signal, we consider an adaptive sequential selection of projections, in the spirit of  ...  Our contribution to this development consists in the mathematical analysis of an ADA method, which is based on an information-theoretic optimization criterion, in the context of discrimination between two  ... 
doi:10.1109/tnn.2005.844900 pmid:15940983 fatcat:uya2k63oy5fancni7tsais4anm

KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition

Jian Yang, A.F. Frangi, Jing-Yu Yang, David Zhang, Zhong Jin
2005 IEEE Transactions on Pattern Analysis and Machine Intelligence  
This framework provides novel insights into the nature of KFD. Based on this framework, the authors propose a complete kernel Fisher discriminant analysis (CKFD) algorithm.  ...  This paper examines the theory of kernel Fisher discriminant analysis (KFD) in a Hilbert space and develops a two-phase KFD framework, i.e., kernel principal component analysis (KPCA) plus Fisher linear  ...  Let us begin with the linear discriminant analysis methods. Liu et al.  ... 
doi:10.1109/tpami.2005.33 pmid:15688560 fatcat:ctyjyfjbmne6hn6l3hpzohep6a
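
The two-phase structure the abstract describes, kernel PCA followed by Fisher discriminant analysis in the KPCA space, can be sketched with scikit-learn. The kernel choice, its gamma, and the number of components are placeholders, and the CKFD split into regular and irregular discriminant information is not reproduced.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
# Phase 1: nonlinear mapping via kernel PCA; Phase 2: LDA in the KPCA space.
clf = make_pipeline(KernelPCA(n_components=60, kernel="rbf", gamma=1e-3),
                    LinearDiscriminantAnalysis())
clf.fit(X, y)
print(clf.score(X, y))
```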

Maximization of Mutual Information for Supervised Linear Feature Extraction

Jose Miguel Leiva-Murillo, Antonio Artes-Rodriguez
2007 IEEE Transactions on Neural Networks  
The method is based on the maximization of the mutual information (MI) between the features extracted and the classes.  ...  Then, a component-by-component gradient-ascent method is proposed for the maximization of the MI, similar to the gradient-based entropy optimization used in independent component analysis (ICA).  ...  Some of these methods are linear discriminant analysis (LDA) [3], sliced inverse regression (SIR) [4], partial least square regression (PLS) [5], and canonical correlation analysis (CCA) [6].  ... 
doi:10.1109/tnn.2007.891630 pmid:18220191 fatcat:cvjjaro3inbb5o36r67zj7nzqy
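
The objective here is the mutual information I(w^T x; y) between a projected feature and the class label. The sketch below only illustrates that objective with a crude histogram plug-in estimate over random unit directions; the paper instead ascends the MI gradient component by component, which is not shown. Function names and the binning scheme are assumptions.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def best_direction_by_mi(X, y, n_candidates=500, n_bins=16, seed=0):
    """Score random unit directions w by a histogram estimate of I(w.x; y)
    and keep the best one (illustration of the objective only)."""
    rng = np.random.default_rng(seed)
    best_w, best_mi = None, -np.inf
    for _ in range(n_candidates):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        proj = X @ w
        bins = np.digitize(proj, np.histogram_bin_edges(proj, bins=n_bins))
        mi = mutual_info_score(y, bins)
        if mi > best_mi:
            best_w, best_mi = w, mi
    return best_w, best_mi
```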
Showing results 1 — 15 out of 6,475 results