Nov 28, 2015 · We develop a dynamic stochastic proximal-gradient consensus (DySPGC) algorithm with the following key features: i) it works for both the static and certain randomly time-varying networks.
Feb 23, 2017 · We consider solving a convex optimization problem, with a possibly stochastic gradient, over a randomly time-varying multi-agent network.
A dynamic stochastic proximal-gradient consensus (DySPGC) algorithm that works for both the static and certain randomly time-varying networks; it allows the agents ...
[Slide excerpts: "The ADMM for Network Consensus"; "Case 2: Stochastic Gradient with Static Graph"; prior work except [Wei-Ozdaglar 13] does not handle random graphs. Presenter: Mingyi Hong.]
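The snippets above describe agents minimizing a shared convex objective by mixing local iterates with neighbors while taking (possibly noisy) gradient steps. A minimal sketch of that idea, using a simplified decentralized stochastic-gradient update with a doubly stochastic mixing matrix over a static ring, a hypothetical least-squares objective, and invented names (`W`, `alpha`); this is a generic consensus-gradient stand-in, not the exact DySPGC update from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 agents jointly minimize sum_i 0.5*||A_i x - b_i||^2
# over a static ring network, using only local data plus neighbor averaging.
n_agents, dim = 4, 3
A = [rng.standard_normal((5, dim)) for _ in range(n_agents)]
b = [rng.standard_normal(5) for _ in range(n_agents)]

# Doubly stochastic mixing matrix W for a ring (self weight 1/2, neighbors 1/4).
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

x = np.zeros((n_agents, dim))  # each row is one agent's local iterate
alpha = 0.01                   # constant step size

for t in range(2000):
    # Stochastic gradient: exact local gradient plus zero-mean noise.
    grads = np.array([A[i].T @ (A[i] @ x[i] - b[i]) for i in range(n_agents)])
    grads += 0.01 * rng.standard_normal(grads.shape)
    # Consensus step (average with neighbors) followed by a gradient step.
    x = W @ x - alpha * grads

# All agents should end up near the common minimizer of the summed objective.
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
```

With a constant step size this scheme only reaches an O(alpha)-neighborhood of the optimum; DySPGC's proximal formulation is what yields the stronger guarantees claimed in the abstract.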
Bibliographic details on Stochastic Proximal Gradient Consensus Over Random Networks.
M. Hong and T.-H. Chang, "Stochastic proximal gradient consensus over time-varying networks," in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process. (ICASSP), 2016, pp. 4776–4780.