Complete Dictionary Recovery Using Nonconvex Optimization
2015
International Conference on Machine Learning
Our algorithm is based on nonconvex optimization with a spherical constraint, and hence is naturally phrased in the language of manifold optimization. ...
This recovery setting is central to the theoretical understanding of dictionary learning. ...
For exact recovery, we use a simple linear programming rounding procedure, which is guaranteed to exactly produce the optimizer q. We then use deflation to sequentially recover the other rows of X_0. ...
dblp:conf/icml/SunQW15
fatcat:ko6g4tqutbexlmu6vbzxxg6eby
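The spherical manifold optimization this abstract describes can be illustrated with a minimal sketch: Riemannian gradient descent on a smoothed ℓ_1 surrogate over the unit sphere. The log-cosh surrogate, problem sizes, step size, and smoothing parameter mu below are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of the complete-dictionary model: Y = A0 @ X0, A0 orthogonal, X0 sparse.
n, p, theta = 10, 2000, 0.1
A0, _ = np.linalg.qr(rng.standard_normal((n, n)))
X0 = rng.standard_normal((n, p)) * (rng.random((n, p)) < theta)
Y = A0 @ X0

mu = 0.01  # smoothing parameter (illustrative choice)

def grad(q):
    # Gradient of the smoothed l1 surrogate (1/p) * sum_i mu * log cosh(q^T y_i / mu).
    return Y @ np.tanh((q @ Y) / mu) / Y.shape[1]

q = rng.standard_normal(n)
q /= np.linalg.norm(q)
for _ in range(1000):
    g = grad(q)
    g -= (g @ q) * q              # project onto the tangent space of the sphere
    q -= 0.1 * g
    q /= np.linalg.norm(q)        # retract back onto the sphere

# At a global minimizer, q aligns with a column of A0, so q^T Y recovers a row of X0.
alignment = np.max(np.abs(A0.T @ q))
```

Deflation (finding the next row after one is recovered) would then repeat this on the orthogonal complement.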
Finding the Sparsest Vectors in a Subspace: Theory, Algorithms, and Applications
[article]
2020
arXiv
pre-print
recovery, dictionary learning, sparse blind deconvolution, and many other problems in signal processing and machine learning. ...
In this paper, we overview recent advances on global nonconvex optimization theory for solving this problem, ranging from geometric analysis of its optimization landscapes, to efficient optimization algorithms ...
Wright, “Complete dictionary recovery over the sphere,” arXiv preprint arXiv:1504.06785,
2015.
[20] S. Burer and R. D. ...
arXiv:2001.06970v1
fatcat:zluhhl3635bzrnnk7fjw5tvi7a
From Symmetry to Geometry: Tractable Nonconvex Problems
[article]
2022
arXiv
pre-print
The optimization problems encountered in practice are often nonconvex. ...
As science and engineering have become increasingly data-driven, the role of optimization has expanded to touch almost every stage of the data analysis pipeline, from signal and data acquisition to modeling ...
This basic issue affects both the well-posedness of the matrix completion problem and our ability to solve it globally using nonconvex optimization. ...
arXiv:2007.06753v4
fatcat:l3kursnwwjc23l4opu235a3reu
Continuous compressed sensing with a single or multiple measurement vectors
2014
2014 IEEE Workshop on Statistical Signal Processing (SSP)
In this paper, a link between CCS and low rank matrix completion (LRMC) is established based on an ℓ_0-pseudo-norm-like formulation, and theoretical guarantees for exact recovery are analyzed. ...
Practically efficient algorithms are proposed based on the link and convex and nonconvex relaxations, and validated via numerical simulations. ...
We propose optimization methods for signal recovery based on this link, using both convex and nonconvex relaxations, and present computationally efficient algorithms using the alternating direction method of ...
doi:10.1109/ssp.2014.6884632
dblp:conf/ssp/YangX14
fatcat:wjsqtvexb5cwxhjhnowa7fdqt4
Dictionary Learning with BLOTLESS Update
2020
IEEE Transactions on Signal Processing
Numerical simulations show that the bounds approximate well the number of training data needed for exact dictionary recovery. ...
In the error free case, three necessary conditions for exact recovery are identified. ...
Both figures include the cases of complete and over-complete dictionaries. ...
doi:10.1109/tsp.2020.2971948
fatcat:pmzbz4r6uzdsnpygrsgmadneqa
Dictionary Learning with BLOTLESS Update
[article]
2020
arXiv
pre-print
Numerical simulations show that the bounds approximate well the number of training data needed for exact dictionary recovery. ...
In the error free case, three necessary conditions for exact recovery are identified. ...
Both figures include the cases of complete and over-complete dictionaries. ...
arXiv:1906.10211v3
fatcat:u4mbak3xwzfstfvnsjvw56tntm
A Survey on Nonconvex Regularization Based Sparse and Low-Rank Recovery in Signal Processing, Statistics, and Machine Learning
[article]
2019
arXiv
pre-print
Recently, nonconvex regularization based sparse and low-rank recovery has attracted considerable interest, and it is in fact a main driver of the recent progress in nonconvex and nonsmooth optimization. ...
We present recent developments of nonconvex regularization based sparse and low-rank recovery in these fields, addressing the issues of penalty selection, applications and the convergence of nonconvex ...
Section V discusses nonconvex regularized low-rank recovery problems, including matrix completion and robust PCA. ...
arXiv:1808.05403v3
fatcat:lfq3t5gvgngmllu27ml7xnehtm
Sparse Signal Recovery by Difference of Convex Functions Algorithms
[chapter]
2013
Lecture Notes in Computer Science
This paper deals with the problem of signal recovery, which is formulated as an ℓ_0-minimization problem. ...
Using two appropriate continuous approximations of the ℓ_0-norm, we reformulate the problem as a DC (Difference of Convex functions) program. ...
Let us firstly give some basic definitions and notations in CS. For a complete study of CS the reader is referred to [8] and the references therein. ...
doi:10.1007/978-3-642-36543-0_40
fatcat:76n7y5lwuvew7fkja6dh7q6vc4
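The DC reformulation described in this abstract can be sketched with the DC Algorithm (DCA): at each outer step the subtracted convex part is linearized at the current iterate, leaving a convex ℓ_1 subproblem, solved here by ISTA. The capped-ℓ_1 approximation of the ℓ_0-norm and every parameter choice below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse recovery toy problem: b = A @ x_true with a 5-sparse x_true (noiseless).
m, n, k = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = 3.0 + rng.standard_normal(k)
b = A @ x_true

lam, a = 0.01, 0.5               # capped-l1 parameters (illustrative choices)
L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of the smooth part

def ista(linear, x0, iters=500):
    # Proximal gradient for: min 0.5*||A x - b||^2 + lam*||x||_1 - <linear, x>.
    x = x0.copy()
    for _ in range(iters):
        z = x - (A.T @ (A @ x - b) - linear) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# DC split: f(x) = [0.5||Ax-b||^2 + lam||x||_1] - [lam * sum_i max(|x_i| - a, 0)];
# DCA linearizes the subtracted part at x_k and solves the remaining l1 problem.
x = np.zeros(n)
for _ in range(10):
    y = lam * np.sign(x) * (np.abs(x) > a)   # subgradient of the subtracted part
    x = ista(y, x)

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
```

Each DCA step cancels the ℓ_1 penalty on coordinates already larger than a, which removes the shrinkage bias that a plain lasso solution would carry.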
A New Theory for Matrix Completion
2017
Neural Information Processing Systems
Equipped with this new tool, we prove a series of theorems for missing data recovery and matrix completion. ...
In particular, we prove that the exact solutions that identify the target matrix are included as critical points by the commonly used nonconvex programs. ...
Acknowledgment We would like to thank the anonymous reviewers and meta-reviewers for providing many valuable comments that helped refine this paper. ...
dblp:conf/nips/LiuLY17
fatcat:4j2xuuliezcnfbb4vhfpmrcroi
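A common nonconvex program of the kind this abstract analyzes factorizes the unknown matrix as UV^T and runs gradient descent on the residual over the observed entries. A minimal sketch with spectral initialization follows; the sizes, sampling rate, and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rank-2 ground truth; each entry observed independently with probability 0.5.
n, r, prob = 30, 2, 0.5
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < prob

# Spectral initialization from the zero-filled, rescaled observations.
u, s, vt = np.linalg.svd((mask * M) / prob)
U = u[:, :r] * np.sqrt(s[:r])
V = vt[:r].T * np.sqrt(s[:r])

step = 0.01
for _ in range(2000):
    R = mask * (U @ V.T - M)                 # residual on observed entries only
    U, V = U - step * (R @ V), V - step * (R.T @ U)

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
```

The tuple assignment evaluates both gradients at the old (U, V), i.e. a simultaneous rather than sequential update.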
Learning Sparsely Used Overcomplete Dictionaries via Alternating Minimization
2016
SIAM Journal on Optimization
Combined with the recent results of approximate dictionary estimation, this yields provable guarantees for exact recovery of both the dictionary elements and the coefficients, when the dictionary elements ...
Typically, the coefficients are estimated via ℓ_1 minimization, keeping the dictionary fixed, and the dictionary is estimated through least squares, keeping the coefficients fixed. ...
optimization problem for dictionary recovery. ...
doi:10.1137/140979861
fatcat:mzd6f2ovjzhrrnzpetubpu5bpy
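The alternating scheme this abstract describes — ℓ_1 sparse coding with the dictionary fixed, then a least-squares dictionary update with the coefficients fixed — can be sketched as below. The warm start near the true dictionary mirrors a local-convergence setting; all sizes and parameters are illustrative assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

# Complete dictionary (atoms = signal dimension), sparse codes, noiseless data.
n, K, N, s = 16, 16, 400, 3
D0 = rng.standard_normal((n, K))
D0 /= np.linalg.norm(D0, axis=0)
X0 = np.zeros((K, N))
for j in range(N):
    X0[rng.choice(K, s, replace=False), j] = 2.0 + rng.standard_normal(s)
Y = D0 @ X0

def sparse_code(D, lam=0.1, iters=200):
    # l1-regularized coding by ISTA, dictionary held fixed.
    L = np.linalg.norm(D, 2) ** 2
    X = np.zeros((D.shape[1], Y.shape[1]))
    for _ in range(iters):
        Z = X - D.T @ (D @ X - Y) / L
        X = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)
    return X

D = D0 + 0.05 * rng.standard_normal((n, K))   # warm start near the true dictionary
D /= np.linalg.norm(D, axis=0)
for _ in range(10):
    X = sparse_code(D)                        # coefficient step (dictionary fixed)
    D = Y @ np.linalg.pinv(X)                 # least-squares dictionary step
    D /= np.linalg.norm(D, axis=0)            # keep atoms on the unit sphere
X = sparse_code(D)

rel_err = np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)
```

Renormalizing the atoms after each least-squares step removes the scale ambiguity between the dictionary and the coefficients.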
Robustness Analysis of Structured Matrix Factorization via Self-Dictionary Mixed-Norm Optimization
2016
IEEE Signal Processing Letters
More importantly, we also show that using nonconvex mixed (quasi) norms is more advantageous in terms of robustness against noise. ...
Prior works showed that such a factorization problem can be formulated as a self-dictionary sparse optimization problem under some assumptions that are considered realistic in many applications, and convex ...
Problem (4) is called a self-dictionary sparse formulation because the data matrix itself is used as a dictionary to perform sparse optimization. ...
doi:10.1109/lsp.2015.2498523
fatcat:rmynz4j2wfadharlcbtm2pighm
Fast Learning with Nonconvex L1-2 Regularization
[article]
2017
arXiv
pre-print
Convex regularizers are often used for sparse learning. They are easy to optimize, but can lead to inferior prediction performance. ...
The difference of ℓ_1 and ℓ_2 (ℓ_1-2) regularizer has been recently proposed as a nonconvex regularizer. It yields better recovery than both ℓ_0 and ℓ_1 regularizers on compressed sensing. ...
Finally, for the nonconvex regularization, we use 1) FaNCL [25] : The state-of-the-art solver for matrix completion with nonconvex regularizers. ...
arXiv:1610.09461v3
fatcat:bauk2i5kejht7bnhz7r6cjwsuq
A Primal-Dual Analysis of Global Optimality in Nonconvex Low-Rank Matrix Recovery
2018
International Conference on Machine Learning
We propose a primal-dual based framework for analyzing the global optimality of nonconvex low-rank matrix recovery. ...
completion. ...
For instance, prior work studied the nonconvex geometry of the complete dictionary recovery problem and proved that all local minima are global ones. ...
dblp:conf/icml/ZhangWYG18
fatcat:mgruwfzf3rhhtp3ntfj5ghktem
Monocular 3D Pose Recovery via Nonconvex Sparsity with Theoretical Analysis
[article]
2018
arXiv
pre-print
For recovering 3D object poses from 2D images, a prevalent method is to pre-train an over-complete dictionary D = {B_i}_{i=1}^{D} of 3D basis poses. ...
noises, optimization times. ...
the recovery results of dictionary-based methods, and moving towards the optimal speed-accuracy trade-off. ...
arXiv:1812.11295v1
fatcat:e6d2llb72jgybfiaje3ueqhdau
A Novel Robust Principal Component Analysis Algorithm of Nonconvex Rank Approximation
2020
Mathematical Problems in Engineering
In low-rank matrix recovery, the noise may exhibit low rank or lack sparsity, and the nuclear norm is not an accurate approximation of the rank of a low-rank matrix. ...
In the present study, to address this problem, a novel nonconvex approximation function for the rank of a low-rank matrix is proposed. ...
Optimal Solution Is Block Diagonal. In the selection of a right dictionary, the lowest rank representation will reveal the true segmentation result. ...
doi:10.1155/2020/9356935
fatcat:emkxouiv2rhubhfokxrcrt4s6q
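One standard nonconvex route to robust PCA, in the spirit of this abstract, alternates a truncated SVD for the low-rank part with hard thresholding for the sparse part. The fixed threshold tau, the outlier model, and all sizes below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Low-rank plus sparse toy data: L0 has rank 2, S0 holds a few large outliers.
n, r = 40, 2
L0 = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
S0 = np.zeros((n, n))
idx = rng.random((n, n)) < 0.02
S0[idx] = 10.0 * np.sign(rng.standard_normal(idx.sum()))
M = L0 + S0

tau = 5.0   # hard-threshold level (illustrative; should exceed typical entries of L0)
S = np.zeros((n, n))
for _ in range(20):
    # Low-rank step: best rank-r approximation of M - S via truncated SVD.
    u, s, vt = np.linalg.svd(M - S)
    L = (u[:, :r] * s[:r]) @ vt[:r]
    # Sparse step: keep only the large residual entries.
    R = M - L
    S = R * (np.abs(R) > tau)

rel_err = np.linalg.norm(L - L0) / np.linalg.norm(L0)
```

Published variants typically decay the threshold or grow the rank across iterations; the fixed values here keep the sketch short.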