2018 (v1) Publication
No description
Uploaded on: February 11, 2024
-
2018 (v1) Publication
No description
Uploaded on: February 11, 2024 -
2020 (v1) Publication
Kernel methods provide an elegant and principled approach to nonparametric learning, but so far they could hardly be used in large-scale problems, since naïve implementations scale poorly with data size. Recent advances have shown the benefits of a number of algorithmic ideas, for example combining optimization, numerical linear algebra and random...
Uploaded on: April 14, 2023 -
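The abstract above alludes to combining numerical linear algebra with random projections to make kernel methods scale. A minimal sketch of one such idea, a Nyström-style landmark approximation for kernel ridge regression, is below; the data, parameter values, and function names are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy sine (purely illustrative).
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)

def rbf(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Nystroem-style approximation: pick m << n random landmark points and
# solve a reduced m x m regularized system instead of the full n x n one.
m = 50
landmarks = X[rng.choice(len(X), m, replace=False)]

K_nm = rbf(X, landmarks)          # (n, m) cross-kernel
K_mm = rbf(landmarks, landmarks)  # (m, m) landmark kernel

lam = 1e-3
alpha = np.linalg.solve(K_nm.T @ K_nm + lam * len(X) * K_mm,
                        K_nm.T @ y)

def predict(X_new):
    return rbf(X_new, landmarks) @ alpha

mse = np.mean((predict(X) - y) ** 2)
```

Replacing the n x n kernel system by an m x m one cuts the cost from O(n³) to O(nm² + m³), which is the kind of space/time saving the abstract refers to.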
2021 (v1) Publication
We propose and analyze an accelerated iterative dual diagonal descent algorithm for the solution of linear inverse problems with strongly convex regularization and general data-fit functions. We develop an inertial approach of which we analyze both convergence and stability properties. Using tools from inexact proximal...
Uploaded on: April 14, 2023 -
2020 (v1) Publication
Object detection is a fundamental ability for robots interacting within an environment. While stunningly effective, state-of-the-art deep learning methods require huge amounts of labeled images and hours of training, which does not favour such robotic scenarios. This work presents a novel pipeline resulting from integrating (Maiettini et al. in 2017...
Uploaded on: April 14, 2023 -
2022 (v1) Publication
Gaussian process optimization is a successful class of algorithms (e.g., GP-UCB) to optimize a black-box function through sequential evaluations. However, for functions with continuous domains, Gaussian process optimization has to rely on either a fixed discretization of the space, or the solution of a non-convex optimization subproblem at each...
Uploaded on: February 14, 2024 -
2021 (v1) Publication
We introduce ParK, a new large-scale solver for kernel ridge regression. Our approach combines partitioning with random projections and iterative optimization to reduce space and time complexity while provably maintaining the same statistical accuracy. In particular, constructing suitable partitions directly in the feature space rather than in...
Uploaded on: February 14, 2024 -
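To make the "partitioning plus local solves" idea in the ParK abstract concrete, here is a hedged toy sketch: data are split by nearest of a few random centers and an independent small kernel ridge regression is solved per cell, instead of one global system. This is only a schematic illustration of partitioned KRR, not ParK's actual algorithm (which also uses random projections and iterative optimization).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D regression data (illustrative; not the paper's benchmarks).
X = rng.uniform(-3, 3, size=(600, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(600)

def rbf(A, B, gamma=2.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Partition points by nearest of p random centers.
p = 4
centers = X[rng.choice(len(X), p, replace=False)]
cell = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)

# Solve one small kernel ridge regression per cell.
lam = 1e-3
models = []
for j in range(p):
    Xj, yj = X[cell == j], y[cell == j]
    Kj = rbf(Xj, Xj)
    aj = np.linalg.solve(Kj + lam * len(Xj) * np.eye(len(Xj)), yj)
    models.append((Xj, aj))

def predict(X_new):
    c = np.argmin(((X_new[:, None, :] - centers[None, :, :]) ** 2).sum(-1),
                  axis=1)
    out = np.empty(len(X_new))
    for j, (Xj, aj) in enumerate(models):
        mask = c == j
        out[mask] = rbf(X_new[mask], Xj) @ aj
    return out

mse = np.mean((predict(X) - y) ** 2)
```

Each local system is roughly (n/p) x (n/p), so both memory and solve time shrink by polynomial factors in p, at the price of some error near cell boundaries.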
2022 (v1) Publication
In this note, we provide an elementary analysis of the prediction error of ridge regression with random design. The proof is short and self-contained. In particular, it bypasses the use of Rudelson's deviation inequality for covariance matrices, through a combination of exchangeability arguments, matrix perturbation and operator convexity.
Uploaded on: February 11, 2024 -
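The quantity analyzed in the note above, the prediction error of ridge regression under random design, can be estimated empirically. A minimal sketch follows; the model, dimensions, and regularization value are illustrative assumptions, not the note's setting.

```python
import numpy as np

rng = np.random.default_rng(2)

# Random-design linear model: y = <x, w*> + noise.
n, d = 200, 10
w_star = rng.standard_normal(d)
X = rng.standard_normal((n, d))
y = X @ w_star + 0.1 * rng.standard_normal(n)

# Ridge estimator: w_hat = (X^T X + n*lam*I)^{-1} X^T y.
lam = 1e-2
w_hat = np.linalg.solve(X.T @ X + n * lam * np.eye(d), X.T @ y)

# Monte Carlo estimate of the prediction error E[(<x, w_hat> - <x, w*>)^2]
# on fresh points drawn from the same design distribution.
X_test = rng.standard_normal((1000, d))
pred_err = np.mean((X_test @ w_hat - X_test @ w_star) ** 2)
```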
2022 (v1) Publication
Orthogonal projections are a standard technique of dimensionality reduction in machine learning applications. We study the problem of approximating orthogonal matrices so that their application is numerically fast and yet accurate. We find an approximation by solving an optimization problem over a set of structured matrices, that we call...
Uploaded on: February 14, 2024 -
2020 (v1) Publication
Gaussian processes (GP) are one of the most successful frameworks to model uncertainty. However, GP optimization (e.g., GP-UCB) suffers from major scalability issues. Experimental time grows linearly with the number of evaluations, unless candidates are selected in batches (e.g., using GP-BUCB) and evaluated in parallel. Furthermore,...
Uploaded on: April 14, 2023 -
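As background for the two GP-optimization abstracts above, here is a hedged sketch of the basic GP-UCB loop they both build on: fit a GP posterior to past evaluations and query the point maximizing mean plus scaled standard deviation. The objective, grid discretization, and constants are illustrative assumptions; this is the sequential baseline, not the batch (GP-BUCB) or discretization-free variants the papers study.

```python
import numpy as np

def f(x):
    """Toy black-box objective (unknown to the optimizer)."""
    return np.sin(3 * x) + 0.5 * x

grid = np.linspace(0, 2, 200)  # fixed candidate discretization (assumption)

def k(a, b, ell=0.3):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

beta, noise = 2.0, 1e-3
X_obs, y_obs = [1.0], [f(1.0)]  # one initial evaluation

for t in range(15):
    Xa = np.array(X_obs)
    K = k(Xa, Xa) + noise * np.eye(len(Xa))
    Ks = k(grid, Xa)
    # GP posterior mean and variance on the grid.
    mu = Ks @ np.linalg.solve(K, np.array(y_obs))
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    # Upper confidence bound acquisition: optimism in the face of uncertainty.
    ucb = mu + beta * np.sqrt(np.clip(var, 0.0, None))
    x_next = grid[np.argmax(ucb)]
    X_obs.append(x_next)
    y_obs.append(f(x_next))

best_value = max(y_obs)
```

The per-step cost is dominated by the t x t linear solves, which grow with the number of evaluations; that growth is exactly the scalability issue the abstract points to.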
2021 (v1) Publication
Object segmentation is a key component in the visual system of a robot that performs tasks like grasping and object manipulation, especially in the presence of occlusions. As in many other computer vision tasks, the adoption of deep architectures has made available algorithms that perform this task with remarkable performance. However, adoption of...
Uploaded on: April 14, 2023 -
2023 (v1) Publication
Plankton microorganisms play a major role in the aquatic food web. Recently, it has been proposed to use plankton as biosensors, since they can react to even minimal perturbations of the aquatic environment with specific physiological changes, which may lead to alterations in morphology and behavior. Nowadays, the development of high-resolution...
Uploaded on: February 11, 2024 -
2022 (v1) Publication
We provide a comprehensive study of the convergence of the forward-backward algorithm under suitable geometric conditions, such as conditioning or Łojasiewicz properties. These geometrical notions are usually local by nature, and may fail to describe the fine geometry of objective functions relevant in inverse problems and signal processing,...
Uploaded on: May 5, 2023 -
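The forward-backward algorithm studied in the abstract above alternates a gradient step on the smooth part of the objective with a proximal step on the nonsmooth part. A minimal sketch for the classic l1-regularized least-squares instance (ISTA) is below; the problem sizes and regularization value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sparse recovery toy problem: min_w 0.5*||Xw - y||^2 + lam*||w||_1
n, d, s = 100, 50, 5
X = rng.standard_normal((n, d)) / np.sqrt(n)
w_true = np.zeros(d)
w_true[:s] = rng.standard_normal(s)
y = X @ w_true

lam = 0.01
step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = Lipschitz const of the gradient

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

w = np.zeros(d)
for _ in range(2000):
    grad = X.T @ (X @ w - y)                         # forward (gradient) step
    w = soft_threshold(w - step * grad, step * lam)  # backward (proximal) step
```

The geometric conditions the paper analyzes (conditioning, Łojasiewicz properties) govern how fast iterations like this one converge on a given objective.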
2019 (v1) Publication
No description
Uploaded on: May 13, 2023 -
2023 (v1) Publication
Optimization in machine learning typically deals with the minimization of empirical objectives defined by training data. The ultimate goal of learning, however, is to minimize the error on future data (test error), for which the training data provides only partial information. In this view, the optimization problems that are practically...
Uploaded on: July 3, 2024 -
2020 (v1) Publication
We study reproducing kernel Hilbert spaces (RKHS) on a Riemannian manifold. In particular, we discuss under which condition Sobolev spaces are RKHS and characterize their reproducing kernels. Further, we introduce and discuss a class of smoother RKHS that we call diffusion spaces. We illustrate the general results with a number of detailed...
Uploaded on: April 14, 2023