Published May 4, 2020
| Version v1
Conference paper
Proximal Multitask Learning over Distributed Networks with Jointly Sparse Structure
- Creators
- Jin, Danqi
- Chen, Jie
- Chen, Jingdong
- Richard, Cédric
- Affiliations:
- Northwestern Polytechnical University [Xi'an] (NPU)
- Joseph Louis Lagrange Laboratory (LAGRANGE): Université Nice Sophia Antipolis (UNS), COMUE Université Côte d'Azur (COMUE UCA), Université Côte d'Azur (UCA), Institut national des sciences de l'Univers (INSU - CNRS), Observatoire de la Côte d'Azur, Centre National de la Recherche Scientifique (CNRS)
- Funding:
- ANR-19-CE48-0002, DARLING, Distributed adaptation and learning for graph signals (2019)
- ANR-19-P3IA-0002, 3IA@cote d'azur, 3IA Côte d'Azur (2019)
Description
Modeling relations between locally optimal parameter vectors in multitask networks has attracted much attention in recent years. This work considers a distributed optimization problem for parameter vectors with a jointly sparse structure across nodes, that is, the parameter vectors share the same support set. By introducing an ℓ∞,1-norm penalty at each node and using a proximal gradient method to minimize the regularized cost, we devise a proximal multitask diffusion LMS algorithm that promotes joint sparsity to enhance estimation performance. Analyses are provided to ensure stability, and simulation results are presented to highlight the performance of the algorithm.
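The description combines three ingredients: a diffusion LMS adapt-and-combine step, an ℓ∞,1-norm penalty coupling the nodes' coefficients, and a proximal gradient update. Summing the ℓ∞ norm of each coefficient across nodes penalizes a coefficient as soon as any node uses it, so coefficients tend to switch off at all nodes simultaneously, which is exactly the shared-support structure. The NumPy sketch below is a rough illustration of this idea, not the paper's exact formulation: the ring network, step size mu, regularization weight lam, and the network-wide (rather than per-neighborhood) proximal step are all assumptions of this sketch. The prox of the ℓ∞ norm is evaluated via Moreau decomposition through projection onto the ℓ1 ball.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ring network: N nodes, each estimating an M-tap local model.
N, M = 8, 16
A = np.eye(N)
for k in range(N):
    A[k, (k - 1) % N] = A[k, (k + 1) % N] = 1.0
A /= A.sum(axis=0, keepdims=True)   # left-stochastic combination weights

# Jointly sparse ground truth: all nodes share one support set,
# with node-specific coefficient values on that support.
support = rng.choice(M, size=3, replace=False)
W_true = np.zeros((N, M))
W_true[:, support] = rng.normal(size=(N, support.size))

def project_l1_ball(v, z):
    """Euclidean projection of v onto the l1 ball of radius z."""
    if np.abs(v).sum() <= z:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    idx = np.nonzero(u * np.arange(1, v.size + 1) > css - z)[0][-1]
    tau = (css[idx] - z) / (idx + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_linf(v, lam):
    """prox of lam*||.||_inf, via Moreau decomposition against the l1 ball."""
    return v - project_l1_ball(v, lam)

mu, lam, noise_std = 0.01, 0.02, 0.1
W = np.zeros((N, M))
for _ in range(3000):
    # Adapt: one stochastic-gradient (LMS) step per node on fresh local data.
    psi = np.empty_like(W)
    for k in range(N):
        u = rng.normal(size=M)
        d = u @ W_true[k] + noise_std * rng.normal()
        psi[k] = W[k] + mu * (d - u @ W[k]) * u
    # Combine: each node averages its neighbors' intermediate estimates.
    phi = A.T @ psi
    # Proximal step: prox of the l_inf norm on each coefficient's cross-node
    # column; summed over columns this is the prox of the l_{inf,1} penalty.
    W = np.stack([prox_linf(phi[:, m], mu * lam) for m in range(M)], axis=1)

est_support = np.nonzero(np.abs(W).max(axis=0) > 0.05)[0]
print("true support:", np.sort(support), "estimated support:", est_support)
```

In this toy setup the prox acts on each coefficient's column of values across all nodes, whereas the paper applies the penalty locally at each node over its neighborhood; the columnwise view is used here only to make the joint-sparsity effect easy to see in a few lines.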
Additional details
- URL
- https://hal.archives-ouvertes.fr/hal-03347335
- URN
- urn:oai:HAL:hal-03347335v1
- Origin repository
- UNICA