Projected Sub-Gradient with ℓ1 or Simplex Constraints via Isotonic Regression

Abstract: 

We consider two classic problems in convex optimization: 1) minimizing a convex objective over the intersection of the nonnegative orthant and the ℓ1-ball, and 2) minimizing a convex objective over the probability simplex. We propose a simple and efficient equality-constraint elimination technique that converts the ℓ1 and simplex constraints into order constraints. We formulate the projection onto the feasible set as an isotonic regression problem, which can be solved exactly in O(n) time via the Pool Adjacent Violators Algorithm (PAVA), where n is the dimension of the space. We design a C++ implementation of PAVA that is up to 25,000 times faster than scikit-learn's. Our PAVA-based projection step enables the design of efficient projected subgradient methods that compare well against algorithms using direct projections onto the ℓ1-ball and onto the simplex, which run in O(n log n) exact time and O(n) expected time. Interestingly, our technique is particularly well suited to learning from sparse, skewed, or aggregated data, as it decreases the cross-correlations between data points.
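
The constraint-elimination step can be made concrete. Below is a minimal C++ sketch, under the assumption that the order constraints arise from a cumulative-sum change of variables z_k = x_k + x_{k+1} + … + x_n; the function names are ours, for illustration, and this is one standard way to obtain order constraints rather than a transcription of the paper's exact construction. Under this map, x ≥ 0 is equivalent to the chain z_1 ≥ z_2 ≥ … ≥ z_n ≥ 0, and the simplex equality Σᵢ xᵢ = 1 simply pins z_1 = 1, so the ℓ1/simplex constraints become pure order constraints.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical helper (not from the paper): suffix-sum change of variables
// z_k = x_k + x_{k+1} + ... + x_n. Then x >= 0 becomes
// z_1 >= z_2 >= ... >= z_n >= 0, and sum(x) = 1 becomes z_1 = 1.
std::vector<double> to_order_variables(const std::vector<double>& x) {
    std::vector<double> z(x.size());
    double tail = 0.0;
    for (std::size_t k = x.size(); k-- > 0; ) {
        tail += x[k];  // running suffix sum
        z[k] = tail;
    }
    return z;
}

// Inverse map: recover x from z by differencing adjacent entries.
std::vector<double> from_order_variables(const std::vector<double>& z) {
    std::vector<double> x(z.size());
    for (std::size_t k = 0; k + 1 < z.size(); ++k) x[k] = z[k] - z[k + 1];
    if (!z.empty()) x.back() = z.back();
    return x;
}
```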
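
The projection onto such a chain of order constraints is an isotonic regression, which PAVA solves exactly in O(n). The following self-contained sketch implements unweighted PAVA for a nondecreasing fit; it illustrates the algorithm the abstract names, not the authors' optimized C++ implementation. The nonincreasing chain above can be handled by running it on the reversed sequence.

```cpp
#include <cstddef>
#include <vector>

// Unweighted PAVA: given y, return the nondecreasing x minimizing
// sum_i (x_i - y_i)^2, in O(n) time.
std::vector<double> pava(const std::vector<double>& y) {
    const std::size_t n = y.size();
    std::vector<double> level(n), weight(n);  // block means and sizes
    std::vector<std::size_t> start(n);        // first index of each block
    std::size_t blocks = 0;
    for (std::size_t i = 0; i < n; ++i) {
        level[blocks] = y[i];
        weight[blocks] = 1.0;
        start[blocks] = i;
        ++blocks;
        // Pool adjacent blocks while they violate the order constraint.
        while (blocks > 1 && level[blocks - 2] > level[blocks - 1]) {
            const double w = weight[blocks - 2] + weight[blocks - 1];
            level[blocks - 2] = (weight[blocks - 2] * level[blocks - 2] +
                                 weight[blocks - 1] * level[blocks - 1]) / w;
            weight[blocks - 2] = w;
            --blocks;
        }
    }
    // Expand block means back into a full-length solution vector.
    std::vector<double> x(n);
    for (std::size_t b = 0; b < blocks; ++b) {
        const std::size_t end = (b + 1 < blocks) ? start[b + 1] : n;
        for (std::size_t i = start[b]; i < end; ++i) x[i] = level[b];
    }
    return x;
}
```

A projected subgradient iteration would then plausibly alternate a subgradient step in the z variables with this O(n) re-projection (plus clipping to the bounds, and pinning z_1 = 1 in the simplex case).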

Authors: 
Thai, Jérôme
Wu, Cathy
Pozdnukhov, Alexey
Bayen, Alexandre
Publication date: 
January 1, 2015
Publication type: 
Conference Paper
Citation: 
Thai, J., Wu, C., Pozdnukhov, A., & Bayen, A. (2015). Projected Sub-Gradient with ℓ1 or Simplex Constraints via Isotonic Regression. 2015 54th IEEE Conference on Decision and Control (CDC), 2031–2036. https://ieeexplore.ieee.org/abstract/document/7402505/