Published February 25, 2019
| Version v1
Conference paper
QoS and Energy-Aware Run-time Adaptation for Mobile Robotic Missions: A Learning Approach
Contributors
Others:
- Laboratoire d'Electronique, Antennes et Télécommunications (LEAT); Université Nice Sophia Antipolis (1965-2019) (UNS); COMUE Université Côte d'Azur (2015-2019) (COMUE UCA); Centre National de la Recherche Scientifique (CNRS); Université Côte d'Azur (UCA)
- IEEE
Description
Mobile robotic systems are typically constrained by a shortage of on-board resources, such as computing capability and energy, and are strongly affected by the dynamics of the surrounding environment. This context requires adaptive decisions at run-time that react to dynamic and uncertain operational conditions in order to guarantee performance requirements while respecting the other constraints. In this paper, we propose a reinforcement learning-based approach for a QoS- and energy-aware autonomous robotic mission manager. The mission manager applies reinforcement learning by actively monitoring the performance and energy consumption of the mission and then selecting the best mapping parameter configuration according to a cumulative reward that balances QoS against energy. As a case study, we apply this methodology to an autonomous navigation mission. Our simulation results demonstrate the efficiency of the proposed management framework and suggest a promising solution for real mobile robotic systems.
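To make the idea concrete, the following is a minimal, illustrative Q-learning sketch of the kind of manager the abstract describes. All names here (the mapping configurations, the discretized QoS/battery states, the reward weights, and the simulated environment) are hypothetical assumptions for illustration, not taken from the paper; the sketch only shows how a manager could learn to pick a mapping configuration by balancing a QoS term against an energy term.

```python
# Hypothetical Q-learning mission manager sketch (names and values assumed,
# not from the paper): states are discretized QoS/battery observations,
# actions are candidate mapping configurations, and the reward trades off
# QoS against energy consumption.
import random

ACTIONS = ["low_freq", "mid_freq", "high_freq"]          # assumed mapping configurations
STATES = [(q, e) for q in ("qos_low", "qos_ok")
                 for e in ("batt_low", "batt_ok")]        # assumed discretized states

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate
W_QOS, W_ENERGY = 1.0, 0.5              # assumed weights balancing QoS vs. energy

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(qos, energy):
    """Reward term favoring high QoS and penalizing energy consumption."""
    return W_QOS * qos - W_ENERGY * energy

def simulate_step(state, action):
    """Stand-in for the monitored robot: returns (qos, energy, next_state)."""
    perf = {"low_freq": 0.4, "mid_freq": 0.7, "high_freq": 0.95}[action]
    cost = {"low_freq": 0.2, "mid_freq": 0.5, "high_freq": 0.9}[action]
    qos = perf + random.uniform(-0.1, 0.1)
    energy = cost + random.uniform(-0.05, 0.05)
    next_state = ("qos_ok" if qos > 0.6 else "qos_low",
                  "batt_low" if energy > 0.7 else "batt_ok")
    return qos, energy, next_state

def choose_action(state):
    if random.random() < EPSILON:                      # explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])   # exploit

state = ("qos_low", "batt_ok")
for _ in range(5000):
    action = choose_action(state)
    qos, energy, next_state = simulate_step(state, action)
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    # Standard Q-learning update toward the QoS/energy-balanced reward.
    Q[(state, action)] += ALPHA * (reward(qos, energy) + GAMMA * best_next
                                   - Q[(state, action)])
    state = next_state

for s in STATES:
    print(s, "->", max(ACTIONS, key=lambda a: Q[(s, a)]))
```

After training, the learned policy tends to prefer a cheaper configuration when the battery state is low and a higher-performance one otherwise, which is the qualitative behavior the abstract attributes to the mission manager.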
Abstract
International audience

Additional details
Identifiers
- URL: https://hal.archives-ouvertes.fr/hal-02018703
- URN: urn:oai:HAL:hal-02018703v1
Origin repository
- UNICA