A Decentralized Second-Order Method with Exact Linear Convergence Rate for Consensus Optimization

2016-02-01
This paper considers decentralized consensus optimization problems in which the summands of a global objective function are distributed over the nodes of a network that can communicate only with their neighbors. The proximal method of multipliers is a powerful tool that relies on proximal primal descent and dual ascent updates on a suitably defined augmented Lagrangian. However, the structure of the augmented Lagrangian makes the primal update non-decomposable, which precludes distributed implementation; this difficulty is commonly circumvented with the alternating direction method of multipliers (ADMM). The exact second-order method (ESOM) is introduced here as an alternative that relies on: (i) a separable quadratic approximation of the augmented Lagrangian; and (ii) a truncated Taylor series to approximate the solution of the first-order optimality condition arising from the minimization of that quadratic approximation. The sequences of primal and dual variables generated by ESOM are shown to converge linearly to their optimal arguments when the aggregate cost function is strongly convex and its gradients are Lipschitz continuous. Numerical results demonstrate the advantages of ESOM relative to decentralized alternatives for least-squares and logistic regression problems.
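The truncated Taylor series in item (ii) can be illustrated in isolation. A minimal sketch (not the paper's code): for a matrix A with spectral radius below 1, the Neumann/Taylor expansion (I - A)^{-1} = I + A + A^2 + ... can be truncated at order K, and the approximation error shrinks geometrically in K. ESOM truncates an expansion of this kind so that the approximate Newton step can be computed with neighbor-to-neighbor communication instead of a global matrix inversion.

```python
import numpy as np

def neumann_inverse(A, K):
    """Return I + A + A^2 + ... + A^K, an order-K approximation of (I - A)^{-1}.

    Valid (convergent as K grows) when the spectral radius of A is below 1.
    """
    n = A.shape[0]
    approx = np.eye(n)
    power = np.eye(n)
    for _ in range(K):
        power = power @ A          # power now holds A^k
        approx = approx + power
    return approx

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = 0.1 * M / np.linalg.norm(M, 2)   # rescale so the spectral norm is 0.1
exact = np.linalg.inv(np.eye(5) - A)

for K in (0, 1, 2, 5):
    err = np.linalg.norm(neumann_inverse(A, K) - exact, 2)
    print(K, err)                    # error decreases geometrically in K
```

Each extra term of the series costs one more multiplication by A; in the decentralized setting the analogous matrix has the sparsity pattern of the network, so each term corresponds to one additional round of communication with neighbors.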

Related Articles

2017-07-20

Due to their simplicity and excellent performance, parallel asynchronous variants of stochastic grad…

2016-02-12
1602.03943 | stat.ML

First-order stochastic methods are the state-of-the-art in large-scale machine learning optimization…

2019-02-13
1902.04952 | stat.ML

Subsampled Newton methods approximate Hessian matrices through subsampling techniques, alleviating t…

2018-10-05

In this paper, we study the problem of minimizing a sum of smooth and strongly convex functions spli…

2018-02-05
1802.01504 | math.OC

We consider the convex-concave saddle point problem $\min_{x}\max_{y} f(x)+y^\top A x-g(y)$ where $f…

2018-09-25

In this paper, we propose an interior-point method for linearly constrained optimization problems (p…

2018-09-30

Establishing global convergence of Newton-CG has long been limited to making strong convexity assump…

2018-04-08
1804.02729 | math.OC

We consider a class of distributed non-convex optimization problems that often arise in modern distribut…

2015-11-10
1511.02942 | math.OC

In this note, we extend the algorithms Extra and subgradient-push to a new algorithm ExtraPush for c…

2019-01-24
1901.08958 | cs.LG

We consider the problem of finding local minimizers in non-convex and non-smooth optimization. Under…

2019-01-24
1901.08523 | math.OC

This paper introduces an efficient second-order method for solving the elastic net problem. Its key …

2017-09-25
1709.08360 | cs.SY

This paper proposes distributed discrete-time algorithms to cooperatively solve an additive cost opt…

2019-01-24
1901.08215 | cs.DC

A popular asynchronous protocol for decentralized optimization is randomized gossip where a pair of …

2018-12-10

Network consensus optimization has received increasing attention in recent years and has found impor…

2018-09-05
1809.01275 | math.OC

We propose a new primal-dual homotopy smoothing algorithm for a linearly constrained convex program,…

2018-05-27

We study the trade-offs between convergence rate and robustness to gradient errors in designing a fi…