December 12, 2022 (v1) Publication
We consider the problem of recovering elements of a low-dimensional model from under-determined linear measurements. To perform recovery, we consider the minimization of a convex regularizer subject to a data-fit constraint. Given a model, we ask which convex regularizer is "best" for its recovery. To answer this question,...
Uploaded on: February 22, 2023
-
December 6, 2021 (v1) Publication
We consider the problem of recovering elements of a low-dimensional model from under-determined linear measurements. To perform recovery, we consider the minimization of a convex regularizer subject to a data-fit constraint. Given a model, we ask which convex regularizer is "best" for its recovery. To answer this question,...
Uploaded on: December 3, 2022
-
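The two publication entries above concern recovery by minimizing a convex regularizer subject to a data-fit constraint. As a purely illustrative instance (not code from those papers), the sketch below solves the classic l1 case, basis pursuit, where the low-dimensional model is sparsity and the regularizer is the l1 norm; the problem sizes, the Gaussian measurement operator, and the use of cvxpy are assumptions made only for this demo.

# Illustrative example (not from the listed papers): recover a sparse vector by
# minimizing a convex regularizer, here the l1 norm, subject to the data-fit
# constraint A x = y. Assumes numpy and cvxpy are installed.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, s = 100, 40, 5                     # ambient dimension, measurements, sparsity

# Ground-truth s-sparse vector: here the "low-dimensional model" is sparsity.
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)

# Under-determined Gaussian measurement operator and noiseless measurements.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Convex recovery: minimize ||x||_1 subject to A x = y (basis pursuit).
x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])
problem.solve()

print("relative recovery error:", np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))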
August 21, 2021 (v1) Journal article
We provide statistical learning guarantees for two unsupervised learning tasks in the context of compressive statistical learning, a general framework for resource-efficient large-scale learning that we introduced in a companion paper. The principle of compressive statistical learning is to compress a training collection, in one pass, into a...
Uploaded on: July 4, 2023
-
August 21, 2021 (v1) Journal article
We describe a general framework --compressive statistical learning-- for resource-efficient large-scale learning: the training collection is compressed in one pass into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. A near-minimizer of the risk...
Uploaded on: December 4, 2022
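The two journal articles above build on a sketch of random empirical generalized moments computed in a single pass over the training collection. The snippet below is a hedged illustration of that sketching step only, not the authors' reference implementation: it averages complex exponential (random Fourier) features over streamed mini-batches, with the data dimension, sketch size, and frequency distribution chosen arbitrarily for the demo; only numpy is assumed.

# Hedged illustration (not the papers' reference code): compress a training
# collection, in one pass, into a low-dimensional sketch of random empirical
# generalized moments -- here, averaged random Fourier features. Assumes numpy only.
import numpy as np

def compute_sketch(data_stream, omega):
    """Average exp(i * x^T omega) over all samples, accumulated in a single pass.

    data_stream : iterable of (batch_size, d) arrays, e.g. mini-batches read from disk
    omega       : (d, m) matrix of random frequencies defining the m generalized moments
    """
    sketch_sum = np.zeros(omega.shape[1], dtype=complex)
    n_samples = 0
    for chunk in data_stream:                       # single pass over the collection
        sketch_sum += np.exp(1j * (chunk @ omega)).sum(axis=0)
        n_samples += chunk.shape[0]
    return sketch_sum / n_samples                   # empirical characteristic function at the omegas

# Toy usage: stream two Gaussian clusters in dimension d, sketch with m frequencies.
rng = np.random.default_rng(0)
d, m = 2, 50                                        # data dimension and sketch size (illustrative)
omega = rng.standard_normal((d, m))                 # frequency scale would be tuned in practice

def batches(n_batches=20, batch_size=500):
    centers = np.array([[-3.0, 0.0], [3.0, 0.0]])
    for _ in range(n_batches):
        labels = rng.integers(0, 2, size=batch_size)
        yield centers[labels] + rng.standard_normal((batch_size, d))

z = compute_sketch(batches(), omega)
print("sketch size:", z.shape[0], "- the full collection is never held in memory")
# Learning (e.g. estimating the two cluster centers) would then be done from z alone,
# by searching for parameters whose theoretical sketch matches z.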