On the Complexity of Deterministic Nonsmooth and Nonconvex Optimization
[article]
2022
arXiv
pre-print
In this paper, we present several new results on minimizing a nonsmooth and nonconvex function under a Lipschitz condition. ...
However, the deterministic algorithms have not been fully explored, leaving open several problems in nonsmooth nonconvex optimization. ...
This work was supported in part by the Mathematical Data Science program of the Office of Naval Research under grant number N00014-18-1-2764 and by the Vannevar Bush Faculty Fellowship program under grant ...
arXiv:2209.12463v2
fatcat:2nfeqqfkbnbhlae7q2zpqmdhuy
Efficient Mirror Descent Ascent Methods for Nonsmooth Minimax Problems
2021
Neural Information Processing Systems
In the paper, we propose a class of efficient mirror descent ascent methods to solve the nonsmooth nonconvex-strongly-concave minimax problems by using dynamic mirror functions, and introduce a convergence ...
For our deterministic algorithm, we prove that our deterministic mirror descent ascent (MDA) achieves a lower gradient complexity of O(√κ ϵ^(-2)) under mild conditions, which matches the best known complexity ...
Acknowledgments and Disclosure of Funding This work was partially supported by NSF IIS 1845666, 1852606, 1838627, 1837956, 1956002, OIA 2040588. ...
dblp:conf/nips/HuangWH21
fatcat:ghlekvxxqbg2bohtmcfusitdpy
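To illustrate the descent-ascent template that methods like MDA build on, here is a minimal deterministic gradient descent ascent sketch on a toy objective that is strongly concave in y. The objective, step sizes, and function names are invented for illustration; the paper's MDA additionally uses dynamic mirror functions, which this sketch omits.

```python
def f_grad(x, y):
    # Toy saddle objective f(x, y) = x*y - y**2 / 2:
    # strongly concave in y, with its saddle point at (0, 0).
    return y, x - y  # (df/dx, df/dy)

def gda(x0, y0, eta_x=0.05, eta_y=0.2, inner=20, outer=500):
    x, y = x0, y0
    for _ in range(outer):
        for _ in range(inner):       # inner ascent: drive y toward argmax_y f(x, y)
            _, gy = f_grad(x, y)
            y += eta_y * gy
        gx, _ = f_grad(x, y)         # outer descent on x at the near-optimal y
        x -= eta_x * gx
    return x, y

x_star, y_star = gda(3.0, -2.0)      # both iterates approach the saddle at (0, 0)
```

Running many cheap ascent steps per descent step is what makes the nonconvex-strongly-concave analysis tractable: the inner loop keeps y within a contraction factor of the best response, so the outer loop effectively descends the (smooth) primal function max_y f(x, y).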
Deterministic Nonsmooth Nonconvex Optimization
[article]
2023
arXiv
pre-print
We study the complexity of optimizing nonsmooth nonconvex Lipschitz functions by producing (δ,ϵ)-stationary points. ...
On the other hand, we prove that if the function is even slightly smooth, then the dimension-free rate of Õ(δ^-1ϵ^-3) can be obtained by a deterministic algorithm with merely a logarithmic dependence on ...
nonsmooth and nonconvex optimization. ...
arXiv:2302.08300v1
fatcat:z2mqwzj3jfaspjsrtgzz4xx4p4
Limited memory discrete gradient bundle method for nonsmooth derivative-free optimization
2012
Optimization
In this article, we propose an efficient derivative-free limited memory discrete gradient bundle method for nonsmooth, possibly nonconvex optimization. ...
In this talk, the speaker will first show some surprising experimental results on smooth-nonsmooth criteria and mathematical theory in nonsmooth mechanics, which may not be known to the community of mathematical ...
Billups, Solving the canonical dual of box- and integer-constrained nonconvex quadratic programs via a deterministic direct search algorithm, Optimization Methods and Software, 28, No. 2, April 2013, 313–326 ...
doi:10.1080/02331934.2012.687736
fatcat:m2cbfps4h5ckzorry2uhjw6xlm
Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions
[article]
2020
arXiv
pre-print
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions. ...
We propose a series of randomized first-order methods and analyze their complexity of finding a (δ, ϵ)-stationary point. ...
The authors thank Ohad Shamir for helpful discussions, and for pointing out the difference between being (δ, ϵ)-stationary and being δ-close to an ϵ-stationary point. ...
arXiv:2002.04130v3
fatcat:qmfdypll3zftdktyyiykkmbp5i
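Several entries on this page use the (δ, ϵ)-stationarity notion: x is (δ, ϵ)-stationary if the convex hull of (sub)gradients taken within a δ-ball of x — the Goldstein δ-subdifferential — contains an element of norm at most ϵ. As a hedged illustration (the objective and function names are invented, and averaging sampled subgradients is only a Monte-Carlo proxy for searching the convex hull), one can probe this condition numerically in one dimension:

```python
import random

def grad_abs(x):
    # A subgradient of the nonsmooth objective f(x) = |x|.
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

def goldstein_check(x, delta, eps, n=1000, seed=0):
    # Monte-Carlo proxy: average subgradients sampled uniformly in the
    # delta-ball around x. The average is a convex combination of elements
    # of the Goldstein delta-subdifferential, so a small average certifies
    # (approximate) (delta, eps)-stationarity.
    rng = random.Random(seed)
    avg = sum(grad_abs(x + rng.uniform(-delta, delta)) for _ in range(n)) / n
    return abs(avg) <= eps
```

For f(x) = |x|, points within δ of the kink at 0 pass the check (positive and negative subgradients cancel in the hull), while points farther away fail, since every nearby subgradient is ±1 with the same sign.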
An Incremental Path-Following Splitting Method for Linearly Constrained Nonconvex Nonsmooth Programs
[article]
2018
arXiv
pre-print
The stationary point of Problem 2 is NOT the stationary point of Problem 1. We are sorry and we are working on fixing this error. ...
Very recently, Jiang et al. [20] presented a unified framework to define the ϵ-stationary solution of nonconvex nonsmooth problems, and presented the iteration complexity of the splitting method in terms ...
Specifically, we use the measure of optimality in terms of variational inequality, instead of the closeness to the optimal solution, which is intractable in nonconvex optimization. ...
arXiv:1801.10119v6
fatcat:tfzeq56xvfeuxoxk3jtfqzqibe
Asynchronous Delay-Aware Accelerated Proximal Coordinate Descent for Nonconvex Nonsmooth Problems
2019
Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)
To the best of our knowledge, we are the first to provide stochastic and deterministic accelerated extensions of APCD algorithms for general nonconvex and nonsmooth problems, ensuring that for both bounded ...
However, developing efficient methods for the nonconvex and nonsmooth optimization problems with certain performance guarantee remains a challenge. ...
On the other hand, for nonsmooth regularization terms, proximal gradient methods are commonly used to solve the resulting optimization problems. ...
doi:10.1609/aaai.v33i01.33011528
fatcat:weexm7jahbapnne5kkieox3k44
An Algorithm with Optimal Dimension-Dependence for Zero-Order Nonsmooth Nonconvex Stochastic Optimization
[article]
2024
arXiv
pre-print
Our analysis is based on a simple yet powerful lemma regarding the Goldstein-subdifferential set, which allows utilizing recent advancements in first-order nonsmooth nonconvex optimization. ...
Moreover, the convergence rate achieved by our algorithm is also optimal for smooth objectives, proving that in the nonconvex stochastic zero-order setting, nonsmooth optimization is as easy as smooth ...
This is in stark contrast to smooth nonconvex optimization, in which optimal stochastic and deterministic methods have disparate complexities on the order of ϵ^(-4) and ϵ^(-2), respectively [Arjevani et ...
arXiv:2307.04504v3
fatcat:alr5vcd6anfpplz73l3zumq64e
Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization
[article]
2022
arXiv
pre-print
Second, we propose the gradient-free method (GFM) and stochastic GFM for solving a class of nonsmooth nonconvex optimization problems and prove that both of them can return a (δ,ϵ)-Goldstein stationary ...
Nonsmooth nonconvex optimization problems broadly emerge in machine learning and business decision making, yet two core challenges impede the development of efficient solution methods with finite-time ...
Acknowledgments This work was supported in part by the Mathematical Data Science program of the Office of Naval Research under grant number N00014-18-1-2764 and by the Vannevar Bush Faculty Fellowship ...
arXiv:2209.05045v3
fatcat:xvxyhrtcwzg3xpeqq7bkyjbh34
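The gradient-free methods in this entry and the following one rely on a randomized two-point estimator of the gradient of the uniformly smoothed surrogate f_δ(x) = E_u[f(x + δu)]. Here is a minimal sketch of that estimator driving a descent loop on an ℓ1 objective; the function names, step sizes, and objective are invented for illustration and do not reproduce the paper's exact GFM parameters.

```python
import math
import random

def sphere_sample(d, rng):
    # Uniform random direction on the unit sphere in R^d.
    v = [rng.gauss(0.0, 1.0) for _ in range(d)]
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def gfm_step(f, x, delta, eta, rng):
    # One two-point zeroth-order step: the scaled finite difference
    # (d / 2*delta) * (f(x + delta*w) - f(x - delta*w)) * w is an unbiased
    # estimate of the gradient of the smoothed surrogate f_delta at x.
    d = len(x)
    w = sphere_sample(d, rng)
    fwd = f([xi + delta * wi for xi, wi in zip(x, w)])
    bwd = f([xi - delta * wi for xi, wi in zip(x, w)])
    scale = d / (2.0 * delta) * (fwd - bwd)
    return [xi - eta * scale * wi for xi, wi in zip(x, w)]

# Usage on a nonsmooth, non-differentiable objective f(x) = ||x||_1.
f = lambda x: sum(abs(xi) for xi in x)
rng = random.Random(0)
x = [1.0, -2.0]
for _ in range(2000):
    x = gfm_step(f, x, delta=0.05, eta=0.01, rng=rng)
```

Only function values of f are ever queried, which is the point of the zeroth-order setting; the smoothing radius δ is what links the method's output back to (δ, ϵ)-Goldstein stationarity of the original nonsmooth objective.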
Faster Gradient-Free Algorithms for Nonsmooth Nonconvex Stochastic Optimization
[article]
2024
arXiv
pre-print
We consider the optimization problem of the form min_{x∈ℝ^d} f(x) ≜ 𝔼_ξ[F(x; ξ)], where the component F(x; ξ) is L-mean-squared Lipschitz but possibly nonconvex and nonsmooth. ...
..., where Δ = f(x_0) − inf_{x∈ℝ^d} f(x) and x_0 is the initial point of the algorithm. ...
[28] considered the nonsmooth nonconvex ... Table 1: We present the complexities of finding a (δ, ϵ)-Goldstein stationary point for a d-dimensional L-Lipschitz objective under both deterministic and stochastic ...
arXiv:2301.06428v3
fatcat:3d4ddqvyhbhrjhflvdczivycoq
Proximal Stochastic Methods for Nonsmooth Nonconvex Finite-Sum Optimization
2016
Neural Information Processing Systems
Our results are based on the recent variance reduction techniques for convex optimization but with a novel analysis for handling nonconvex and nonsmooth functions. ...
We analyze stochastic algorithms for optimizing nonconvex, nonsmooth finite-sum problems, where the nonsmooth part is convex. ...
Acknowledgment: SS acknowledges support of NSF grant: IIS-1409802. 5 The datasets can be downloaded from https://www.csie.ntu.edu.tw/~cjlin/ libsvmtools/datasets. ...
dblp:conf/nips/ReddiSPS16
fatcat:pctzoj4hifhuhkqpdlg56ia2fi
Asynchronous Stochastic Proximal Methods for Nonconvex Nonsmooth Optimization
[article]
2018
arXiv
pre-print
We implement the proposed algorithms on Parameter Server and demonstrate its convergence behavior and near-linear speedup, as the number of workers increases, on two real-world datasets. ...
However, compared to asynchronous parallel stochastic gradient descent (AsynSGD), an algorithm targeting smooth optimization, the understanding of the behavior of stochastic algorithms for nonsmooth regularized optimization problems is quite limited, especially when the objective function is nonconvex. ...
arXiv:1802.08880v3
fatcat:othkev23cjbo7kbnqmm7i2ux4y
The cost of nonconvexity in deterministic nonsmooth optimization
[article]
2022
arXiv
pre-print
We study the impact of nonconvexity on the complexity of nonsmooth optimization, emphasizing objectives such as piecewise linear functions, which may not be weakly convex. ...
Our complexity bound depends on a natural nonconvexity modulus, related, intriguingly, to the negative part of directional second derivatives of the objective, understood in the distributional sense. ...
We relate the nonconvexity modulus of the objective with its distributional second derivative, hinting at an intriguing relationship between such derivatives and algorithmic complexity in general optimization ...
arXiv:2210.00652v2
fatcat:dh6hkenixfcfrowszltwi4n3ve
On the Convergence Rate of Stochastic Mirror Descent for Nonsmooth Nonconvex Optimization
[article]
2018
arXiv
pre-print
We focus on a general class of nonconvex nonsmooth stochastic optimization problems, in which the objective can be decomposed into a relatively weakly convex function (possibly non-Lipschitz) and a simple ...
In this paper, we investigate the non-asymptotic stationary convergence behavior of Stochastic Mirror Descent (SMD) for nonconvex optimization. ...
stochastic subgradient algorithm (Davis and Drusvyatskiy, 2018) for solving nonsmooth nonconvex optimization. ...
arXiv:1806.04781v1
fatcat:w5dlhnad7redlfrig7m64bsnxe
No Dimension-Free Deterministic Algorithm Computes Approximate Stationarities of Lipschitzians
[article]
2022
arXiv
pre-print
Our results reveal a fundamental hurdle of nonconvex nonsmooth problems in the modern large-scale setting and their infinite-dimensional extension. ...
Even without the dimension-free requirement, we show that any finite time guaranteed deterministic method cannot be general zero-respecting, which rules out most of the oracle-based methods in smooth optimization ...
Our results shed light on a fundamental hurdle of nonconvex nonsmooth problems in the modern large-scale setting and their infinite-dimensional extension. ...
arXiv:2210.06907v1
fatcat:wfunnsi5ondpdfgdfminwjmdqi
Showing results 1 — 15 out of 1,150 results