Published 2020
| Version v1
Publication
Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
Description
We study the ℓ¹ regularized least squares optimization problem in a separable Hilbert space. We show that the iterative soft-thresholding algorithm (ISTA) converges linearly, without any assumption on the linear operator at play or on the problem. The result is obtained by combining two key concepts: the notion of extended support, a finite set containing the support of the solution, and the notion of conditioning over finite-dimensional sets. We prove that ISTA identifies the extended support of the solution after a finite number of iterations, and we derive linear convergence from the conditioning property, which is always satisfied for ℓ¹ regularized least squares problems. Our analysis extends to the entire class of thresholding gradient algorithms, for which we provide a conceptually new proof of strong convergence, as well as convergence rates.
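The abstract concerns ISTA, whose iteration alternates a gradient step on the least squares term with componentwise soft-thresholding (the proximal operator of the ℓ¹ norm). A minimal finite-dimensional sketch is given below; the function names, the fixed step size 1/L, and the iteration count are illustrative choices, not taken from the paper, which works in a general separable Hilbert space.

```python
import numpy as np

def soft_threshold(v, tau):
    # Componentwise soft-thresholding: prox of tau * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    # ISTA for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    # Step size 1/L, with L = ||A||^2 the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)          # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy example (A = identity): the minimizer is soft_threshold(b, lam),
# so the iterates settle on a sparse vector whose support is identified
# after finitely many iterations.
A = np.eye(3)
b = np.array([2.0, 0.5, -1.0])
x = ista(A, b, lam=1.0)
```

In this toy run the components with |b_i| ≤ lam are thresholded to exactly zero, illustrating the finite-time support identification discussed in the abstract.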
Additional details
Identifiers
- URL: https://hdl.handle.net/11567/1010408
- URN: urn:oai:iris.unige.it:11567/1010408
Origin repository
- UNIGE