We introduce a novel algorithm for solving learning problems where both the loss function and the regularizer are non-convex but belong to the class of difference of convex (DC) functions. Our contribution is a new general-purpose proximal Newton algorithm able to deal with this situation. The algorithm consists of obtaining a descent...
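For reference, the setting described in this abstract can be written as a generic difference-of-convex composite problem. The decomposition and the linearization step below follow standard DC programming; the symbols (ℓ₁, ℓ₂, Ω₁, Ω₂, w_k) are illustrative, and the update shown is the usual DC surrogate step, not necessarily the exact proximal Newton update proposed in the paper.

```latex
% Generic DC composite problem: both the loss and the regularizer are DC functions.
\[
\min_{w \in \mathbb{R}^d} \; F(w)
  = \underbrace{\ell_1(w) - \ell_2(w)}_{\text{loss}}
  \;+\; \underbrace{\Omega_1(w) - \Omega_2(w)}_{\text{regularizer}},
\qquad \ell_1, \ell_2, \Omega_1, \Omega_2 \ \text{convex}.
\]

% Standard DC majorization at the current iterate w_k (assuming \ell_2 and \Omega_2 are
% differentiable; otherwise use subgradients): linearizing the concave parts and
% minimizing the resulting convex surrogate yields a descent step on F.
\[
w_{k+1} \in \arg\min_{w} \; \ell_1(w) + \Omega_1(w)
  - \big\langle \nabla \ell_2(w_k) + \nabla \Omega_2(w_k),\, w \big\rangle.
\]
```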
-
2016 (v1) Journal article
Uploaded on: March 25, 2023
-
October 4, 2019 (v1) Conference paper
Kernel methods have proven to be useful and successful for analysing large-scale multi-omics datasets [Schölkopf et al., 2004]. However, as stated in [Hofmann et al., 2015, Mariette et al., 2017], these methods usually suffer from a lack of interpretability, since the information of thousands of descriptors is summarized in a few similarity measures,...
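As a minimal illustration of the summarization step mentioned above, the sketch below collapses two toy omics blocks with thousands of descriptors each into small n × n similarity (kernel) matrices and averages them; the Gaussian kernel, the unweighted average, and all variable names are assumptions made for the example, not the method of the cited works.

```python
import numpy as np

def gaussian_kernel(X, gamma=1.0):
    """n x n similarity matrix computed from an n x p data block (p may be thousands)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    return np.exp(-gamma * d2)

# Two toy omics blocks measured on the same 50 samples (simulated data).
rng = np.random.default_rng(0)
K_expr = gaussian_kernel(rng.normal(size=(50, 2000)), gamma=1e-3)   # e.g. transcriptomics
K_meth = gaussian_kernel(rng.normal(size=(50, 5000)), gamma=1e-3)   # e.g. methylation

# Naive multi-omics combination: an unweighted average of the per-block kernels.
# Each 50 x 50 matrix summarizes thousands of descriptors, which is exactly where
# the interpretability issue mentioned in the abstract comes from.
K_combined = 0.5 * (K_expr + K_meth)
print(K_combined.shape)   # (50, 50)
```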
Uploaded on: December 4, 2022
-
August 28, 2017 (v1) Conference paper
International audience
Uploaded on: December 3, 2022
-
September 8, 2015 (v1) Publication
In this work, we propose a novel linear classification scheme for non-stationary periodic data. We express the classifier in a temporal basis while regularizing its temporal complexity, which leads to a convex optimization problem. Numerical experiments show very good results on a simulated example and on real-life remote sensing image...
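A rough sketch of the construction described above, assuming a truncated Fourier basis, a logistic loss, and a ridge penalty on the basis coefficients (all illustrative choices, not necessarily those of the publication): the classifier weights vary with time through a small temporal basis, the penalty controls the temporal complexity, and the resulting problem is convex in the coefficients.

```python
import numpy as np
from scipy.optimize import minimize

def fourier_basis(t, n_harmonics=2, period=1.0):
    """Temporal basis phi(t): a constant term plus a few sine/cosine harmonics."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(2 * np.pi * k * t / period), np.sin(2 * np.pi * k * t / period)]
    return np.stack(cols, axis=1)                       # shape (n, K)

def objective(c_flat, X, y, Phi, lam):
    """Logistic loss of the time-varying linear classifier w(t) = C^T phi(t),
    plus a ridge penalty on the basis coefficients (temporal complexity)."""
    C = c_flat.reshape(Phi.shape[1], X.shape[1])
    scores = np.einsum('nk,kd,nd->n', Phi, C, X)        # score of sample n at time t_n
    return np.mean(np.logaddexp(0.0, -y * scores)) + lam * np.sum(C**2)

# Toy non-stationary periodic data: labels in {-1, +1}, timestamps in [0, 1).
rng = np.random.default_rng(0)
n, d = 400, 5
X, t = rng.normal(size=(n, d)), rng.uniform(size=n)
y = np.sign(X[:, 0] * np.cos(2 * np.pi * t) + 0.1 * rng.normal(size=n))

Phi = fourier_basis(t)
res = minimize(objective, np.zeros(Phi.shape[1] * d), args=(X, y, Phi, 1e-2))
print("converged:", res.success, "objective value:", res.fun)
```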
Uploaded on: December 4, 2022
-
September 8, 2018 (v1) Conference paper
In computer vision, one is often confronted with problems of domain shift, which occur when a classifier trained on a source dataset is applied to target data that shares similar characteristics (e.g. the same classes) but has different latent data structures (e.g. different acquisition conditions). In such a situation, the model will perform...
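A small illustration of the domain-shift setting itself (not of any adaptation method): a linear classifier fit on a source domain is evaluated on a target domain with the same classes but a shifted feature distribution, and its accuracy degrades. The synthetic data and the use of scikit-learn's LogisticRegression are assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_domain(n, shift):
    """Two classes sharing the same labels, with a domain-specific offset of the features."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 2)) + np.where(y[:, None] == 1, 2.0, -2.0)
    return X + shift, y

# Source domain (training conditions) and target domain (different acquisition conditions).
Xs, ys = make_domain(500, shift=np.array([0.0, 0.0]))
Xt, yt = make_domain(500, shift=np.array([4.0, 4.0]))   # latent data structure has moved

clf = LogisticRegression().fit(Xs, ys)
print("source accuracy:", clf.score(Xs, ys))            # close to 1.0
print("target accuracy:", clf.score(Xt, yt))            # drops sharply under the shift
```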
Uploaded on: December 4, 2022