11,950 Hits in 7.9 sec

Finding a Sparse Vector in a Subspace: Linear Sparsity Using Alternating Directions

Qing Qu, Ju Sun, John Wright
2016 IEEE Transactions on Information Theory  
Is it possible to find the sparsest vector (direction) in a generic subspace S ⊆ R^p with dim(S) = n < p?  ...  In this paper, we focus on a **planted sparse model** for the subspace: the target sparse vector is embedded in an otherwise random subspace.  ...  Finding a sparse vector in a subspace: Linear sparsity using alternating directions. In Advances in Neural Information Processing Systems, 2014. [SQW14] Ju Sun, Qing Qu, and John Wright.  ... 
doi:10.1109/tit.2016.2601599 fatcat:s4xr2mmy2jcxnlfpi7u2dopqgy
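The alternating-directions idea in this record fits in a few lines: given an orthonormal basis Y of the subspace, alternately soft-threshold Yq to promote sparsity and refit the unit-norm coefficient vector q. A minimal NumPy sketch under the planted sparse model — the function names and the fixed threshold lam are illustrative, and the published algorithm additionally uses a careful initialization and a rounding step:

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding: the prox operator of lam*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_vector_in_subspace(Y, lam=0.1, iters=200, seed=0):
    """Alternating directions: sparsify x ≈ Y q by soft-thresholding,
    then refit the unit-norm coefficient vector q."""
    rng = np.random.default_rng(seed)
    p, n = Y.shape
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    for _ in range(iters):
        x = soft_threshold(Y @ q, lam)   # sparsify the subspace vector
        q = Y.T @ x                      # refit (Y has orthonormal columns)
        nrm = np.linalg.norm(q)
        if nrm == 0:
            break
        q /= nrm                         # project back to the unit sphere
    return Y @ q

# Planted sparse model: one sparse direction inside a random subspace.
p, n = 100, 5
e = np.zeros(p); e[:3] = 1.0             # planted sparse vector
B = np.column_stack([e, np.random.default_rng(1).standard_normal((p, n - 1))])
Y, _ = np.linalg.qr(B)                   # orthonormal basis of the subspace
v = sparse_vector_in_subspace(Y)
```

The soft-thresholding step is what biases the alternation toward spiky (sparse) vectors rather than generic dense ones.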

Directing Power Towards Conic Parameter Subspaces [article]

Nick Koning
2019 arXiv   pre-print
I illustrate the statistic on subspaces that consist of sparse or nearly-sparse vectors, for which the computation corresponds to ℓ_0- and ℓ_1-regularized regression, respectively.  ...  I simultaneously address these two issues by proposing a novel test statistic that is large in a conic parameter subspace of interest.  ...  This allows us to pick C in such a way that power is directed towards a specific alternative of interest.  ... 
arXiv:1907.05077v6 fatcat:jtnplgzbsrhmfizke2yifvcs3a

Kernel sparse subspace clustering

Vishal M. Patel, Rene Vidal
2014 IEEE International Conference on Image Processing (ICIP)  
We show that the alternating direction method of multipliers can be used to efficiently find kernel sparse representations.  ...  In this paper, we extend SSC to non-linear manifolds by using the kernel trick.  ...  The above problems can be efficiently solved by using the classical alternating direction method of multipliers (ADMM) [16] .  ... 
doi:10.1109/icip.2014.7025576 dblp:conf/icip/PatelV14 fatcat:g5hpcmf6t5dyzi6znuvdvniegi

Sparse subspace clustering

Ehsan Elhamifar, Rene Vidal
2009 IEEE Conference on Computer Vision and Pattern Recognition  
We propose a method based on sparse representation (SR) to cluster data drawn from multiple low-dimensional linear or affine subspaces embedded in a high-dimensional space.  ...  Our method is based on the fact that each point in a union of subspaces has a SR with respect to a dictionary formed by all other data points. In general, finding such a SR is NP hard.  ...  We have presented a novel approach to subspace clustering based on sparse representation.  ... 
doi:10.1109/cvpr.2009.5206547 dblp:conf/cvpr/ElhamifarV09 fatcat:wlbox6rlpfhrzobtdm6fekelwa (duplicate record: doi:10.1109/cvprw.2009.5206547 fatcat:3yogs2jzirhl3npbdj4i2f7rmq)
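The self-expressive idea above — every point is a sparse combination of the other data points — reduces to one ℓ_1-regularized regression per column. A minimal sketch, using ISTA (proximal gradient) instead of the usual ADMM solver purely for brevity; names and the toy data are illustrative:

```python
import numpy as np

def soft_threshold(Z, lam):
    return np.sign(Z) * np.maximum(np.abs(Z) - lam, 0.0)

def ssc_coefficients(X, lam=0.05, iters=500):
    """Sparse self-expressive coefficients:
    min_C 0.5*||X - X C||_F^2 + lam*||C||_1 with diag(C) = 0,
    solved approximately by proximal gradient (ISTA)."""
    n = X.shape[1]
    C = np.zeros((n, n))
    step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1 / Lipschitz constant
    for _ in range(iters):
        grad = X.T @ (X @ C - X)
        C = soft_threshold(C - step * grad, step * lam)
        np.fill_diagonal(C, 0.0)              # no point represents itself
    return C

# Six points on two orthogonal lines in R^3.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)
X = np.column_stack([t * u for t in (1, 2, 3)] + [t * v for t in (1, 2, 3)])
C = ssc_coefficients(X)
W = np.abs(C) + np.abs(C).T                   # symmetric affinity matrix
```

Spectral clustering on the affinity W then yields the subspace labels; the key property is that W has no edges between points from different subspaces.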

Face Subspace Learning [chapter]

Wei Bian, Dacheng Tao
2011 Handbook of Face Recognition  
The earliest subspace method for face recognition is Eigenface [43] , which uses PCA [23] to select the most representative subspace for representing a set of face images.  ...  By projecting face images onto the subspace spanned by Eigenface, classifiers can be used in the subspace for recognition. One main limitation of Eigenface is that the  ...  There a query is projected onto a linear subspace spanned by a set of basis vectors, where the basis vectors can be any form from a subspace analysis or a set of local features, and the distance between  ... 
doi:10.1007/978-0-85729-932-1_3 fatcat:ot7fkakworamtavm4jwlp4sfjm
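The Eigenface pipeline described in this chapter is short in code: compute a PCA basis from vectorized face images, project gallery and query images onto it, and classify by nearest neighbor in the subspace. A toy-sized NumPy sketch — the 6-pixel "images" are synthetic stand-ins, not face data:

```python
import numpy as np

def eigenface_basis(X, k):
    """Top-k PCA directions ('Eigenfaces') of the mean-centered image
    matrix X, where each column is a vectorized face image."""
    mean = X.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(X - mean, full_matrices=False)
    return U[:, :k], mean

def project(U, mean, x):
    """Coordinates of image x in the Eigenface subspace."""
    return U.T @ (x - mean.ravel())

rng = np.random.default_rng(0)
gallery = rng.standard_normal((6, 4))        # 4 'images' of 6 pixels each
U, mean = eigenface_basis(gallery, k=3)
codes = np.column_stack([project(U, mean, gallery[:, i]) for i in range(4)])

query = gallery[:, 2] + 0.01 * rng.standard_normal(6)  # noisy copy of image 2
q = project(U, mean, query)
match = int(np.argmin(np.linalg.norm(codes - q[:, None], axis=0)))
```

Classifiers then operate on the low-dimensional codes rather than the raw pixels, which is the dimensionality-reduction benefit the chapter describes.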

Robust Subspace Learning: Robust PCA, Robust Subspace Tracking, and Robust Subspace Recovery

Namrata Vaswani, Thierry Bouwmans, Sajid Javed, Praneeth Narayanamurthy
2018 IEEE Signal Processing Magazine  
The S+LR formulation instead assumes that outliers occur on only a few data vector indices and hence are well modeled as sparse corruptions.  ...  PCA is one of the most widely used dimension reduction techniques. A related easier problem is "subspace learning" or "subspace estimation".  ...  GOSUS (Grassmannian Online Subspace Updates with Structured-sparsity) [53] is another incremental algorithm that uses structured sparsity of the outlier terms in conjunction with a GRASTA-like (or ReProCS-like  ... 
doi:10.1109/msp.2018.2826566 fatcat:4fscwwy7rjbjrm3xoztm5api2i
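The S+LR formulation mentioned in the abstract (robust PCA / principal component pursuit) splits a data matrix into a low-rank part plus sparse corruptions, and admits a compact ADMM solver built from two proximal maps: singular value thresholding for the nuclear norm and soft thresholding for the ℓ_1 norm. A minimal sketch — lam = 1/sqrt(max(m, n)) and mu = 1 are conventional defaults, not tuned:

```python
import numpy as np

def svt(Z, tau):
    """Singular value thresholding: prox of tau*||.||_* (nuclear norm)."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(Z, tau):
    """Elementwise soft thresholding: prox of tau*||.||_1."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def rpca(M, lam=None, mu=1.0, iters=500):
    """ADMM for principal component pursuit:
    min ||L||_* + lam*||S||_1  subject to  M = L + S."""
    lam = lam if lam is not None else 1.0 / np.sqrt(max(M.shape))
    L = np.zeros_like(M); S = np.zeros_like(M); Y = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = soft(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)           # dual ascent on the constraint
    return L, S

# Rank-1 matrix with two large sparse corruptions.
rng = np.random.default_rng(0)
L0 = np.outer(rng.standard_normal(20), rng.standard_normal(15))
S0 = np.zeros((20, 15)); S0[3, 4] = 10.0; S0[7, 2] = -8.0
L, S = rpca(L0 + S0)
```

The sparse term absorbs the few corrupted entries, so the low-rank estimate is not dragged toward the outliers the way plain PCA would be.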

Portfolio diversification using subspace factorizations

Ruairi de Frein, Konstantinos Drakakis, Scott Rickard
2008 42nd Annual Conference on Information Sciences and Systems  
In this work we contribute a new approach to portfolio diversification by comparing a recently developed clustering technique, SemiNMF, with a new sparse low-rank approximate factorization technique, Sparse-semiNMF  ...  We evaluate these techniques using a diffusion model based on the Black-Scholes options pricing model.  ...  PCA algorithms only use second order statistics and give projections of the data in the direction of maximum variance in the remaining orthogonal subspaces.  ... 
doi:10.1109/ciss.2008.4558678 dblp:conf/ciss/FreinDR08 fatcat:tx67ojac4rb4tomn2promozmm4

5. Krylov Subspace Methods [chapter]

2000 Trust Region Methods  
For large sparse symmetric linear systems arising in topology optimization, Krylov subspace methods are required.  ...  Therefore, recycling a subspace of the Krylov subspace and using it to solve the next system can improve the convergence rate significantly.  ... 
doi:10.1137/1.9780898719857.ch5 fatcat:thyxqdt2wrf5vnxclogpcsgg3a
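As a concrete instance of the chapter's topic, conjugate gradients — the canonical Krylov subspace method for sparse symmetric positive-definite systems — fits in a few lines: the k-th iterate minimizes the A-norm of the error over the Krylov subspace span{b, Ab, ..., A^(k-1)b}. A sketch on the tridiagonal 1-D Laplacian, a typical stiffness-type matrix (stored dense here only for brevity):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Conjugate gradient method for a symmetric positive-definite A.
    Requires only matrix-vector products, which is what makes Krylov
    methods attractive for large sparse systems."""
    x = np.zeros_like(b)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p # A-conjugate update of the direction
        rs = rs_new
    return x

# Sparse SPD model problem: the 1-D discrete Laplacian.
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n steps; in practice it is stopped early once the residual falls below tolerance, which is why preconditioning and subspace recycling (as the snippet notes) matter for sequences of related systems.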

A Fast deflation Method for Sparse Principal Component Analysis via Subspace Projections [article]

Cong Xu, Min Yang, Jin Zhang
2019 arXiv   pre-print
In this paper, a series of subspace projections are constructed efficiently by using Householder QR factorization.  ...  The implementation of conventional sparse principal component analysis (SPCA) on high-dimensional data sets has become time-consuming.  ...  Given a data set, PCA aims at finding a sequence of orthogonal vectors that represent the directions of largest variance.  ... 
arXiv:1912.01449v2 fatcat:fiilhiasfzdzjnoigfky7nro4e
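The deflation step this record refers to can be sketched as an orthogonal-complement projection: after extracting loading vectors, remove their span from the data before computing the next component. Here np.linalg.qr (which uses Householder reflections) orthonormalizes the extracted directions; the function name and shapes are illustrative, not the paper's implementation:

```python
import numpy as np

def deflate(X, V):
    """Project data X onto the orthogonal complement of span(V),
    removing already-extracted components before the next one."""
    Q, _ = np.linalg.qr(V)        # Householder-based orthonormalization
    return X - Q @ (Q.T @ X)

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 8))  # data: 10 variables, 8 samples
V = rng.standard_normal((10, 2))  # two previously extracted directions
Xd = deflate(X, V)
```

After deflation the data carries no component along the extracted directions, so the next sparse component is forced to explain new variance.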

Tensor LRR and Sparse Coding-Based Subspace Clustering

Yifan Fu, Junbin Gao, David Tien, Zhouchen Lin, Xia Hong
2016 IEEE Transactions on Neural Networks and Learning Systems  
Subspace clustering groups a set of samples from a union of several linear subspaces into clusters, so that the samples in the same cluster are drawn from the same linear subspace.  ...  The affinity matrix used for spectral clustering is built from the joint similarities in both spatial and feature spaces.  ...  The general process of the BCD is shown in Algorithm 1. We use the linearized alternating direction method (LADM) [30] to solve the constrained optimization problem (15) .  ... 
doi:10.1109/tnnls.2016.2553155 pmid:27164609 fatcat:rdyohb5qybgxvipppbdv5wepyq

Robust Subspace Recovery via Bi-Sparsity Pursuit [article]

Xiao Bian, Hamid Krim
2014 arXiv   pre-print
In this paper, we propose a bi-sparse model as a framework to analyze this problem and provide a novel algorithm to recover the union of subspaces in the presence of sparse corruptions.  ...  Successful applications of sparse models in computer vision and machine learning imply that in many real-world applications, high dimensional data is distributed in a union of low dimensional subspaces  ...  In this section, we leverage the successes of the alternating direction method (ADM) [9] and linearized ADM (LADM) [10] in large-scale sparse representation problems, and focus on designing an appropriate  ... 
arXiv:1403.8067v2 fatcat:zko7ckhw3rdbngpqo6pvzye73a

Efficient Solvers for Sparse Subspace Clustering [article]

Farhad Pourkamali-Anaraki and James Folberth and Stephen Becker
2020 arXiv   pre-print
Using ℓ_1 regularization results in a convex problem but requires O(n^2) storage, and is typically solved by the alternating direction method of multipliers which takes O(n^3) flops.  ...  Sparse subspace clustering (SSC) clusters n points that lie near a union of low-dimensional subspaces.  ...  Sparse subspace clustering (SSC) approaches the problem of finding subspace-preserving coefficients by enforcing a sparsity prior on the columns of the matrix C.  ... 
arXiv:1804.06291v2 fatcat:pprn74ulyjfyhgarz262wxlix4

Latent Space Sparse Subspace Clustering

Vishal M. Patel, Hien Van Nguyen, Rene Vidal
2013 IEEE International Conference on Computer Vision  
We propose a novel algorithm called Latent Space Sparse Subspace Clustering for simultaneous dimensionality reduction and clustering of data lying in a union of subspaces.  ...  Specifically, we describe a method that learns the projection of data and finds the sparse coefficients in the low-dimensional latent space.  ...  alternating direction method of multipliers (ADMM) [6] .  ... 
doi:10.1109/iccv.2013.35 dblp:conf/iccv/PatelNV13 fatcat:lgxzosjv4rfgdkdtnwqcuulp24

Learning Document Representations Using Subspace Multinomial Model

Santosh Kesiraju, Lukáš Burget, Igor Szőke, Jan Černocký
2016 Interspeech 2016  
Subspace multinomial model (SMM) is a log-linear model and can be used for learning low-dimensional continuous representations for discrete data.  ...  In this paper, we propose a new variant of SMM that introduces sparsity and call the resulting model ℓ_1 SMM.  ...  In [11] , sparse log-linear models were used to learn a small and useful feature space for dialogue-act classification.  ... 
doi:10.21437/interspeech.2016-1634 dblp:conf/interspeech/KesirajuBSC16 fatcat:pjhvtk6khvb3zkaqup2vhsk7ie
Showing results 1 — 15 out of 11,950 results